Charlie: Personal Navigation Assistant


Charlie is an intelligent personal navigation assistant with three distinct driver modes and multimodal capabilities. You can activate Charlie by speaking or by tapping the steering wheel twice. Our team examined the modalities that current navigation systems use and found that most are heavily visual. We sought to understand the relationship between the navigation system and the driver's cognitive load, and from there set out to ideate and conceptualize an alternative to mainly visual navigation.


1.5 Weeks

Team Members

Eugene Meng

Mai Bouchet

Marina Lazarevic

Siyi Kou

Shravya Neeruganti




Director and Editor



From our research, we found that the majority of navigation systems today rely heavily on visual components. This is dangerous: visual navigation tends to increase the driver's cognitive load, pulling attention away from the act of driving itself. Our research showed that this division of attention can be fatal.

Problem Space

"When drivers left their route intentionally, navigation systems produced 56% false acoustic and 65% false visual messages, respectively."

1. Current navigation applications do not adapt well to change. For example, a driver who chooses to deviate from a route receives insufficient feedback or cryptic, unhelpful commands.

2. Communicating intentions mid-trip is difficult. It often forces drivers to fumble with a touchscreen while driving, which is dangerous.

3. Information exchange loops are inadequate and existing navigation improvements do not focus enough on driver-device interactivity.


After conducting secondary research and developing a solid understanding of our problem space, we concluded that the most promising navigation system would adapt to driver behavior and route changes, integrating tactile, audible, and visual capabilities to enhance voice interaction. We set out to discover how other sense modalities could be tapped for our navigation system, and began brainstorming how someone might interact with it.

Individual Ideation

In the beginning, we practiced individual ideation on what features could be included in Charlie, then congregated to share, filter, and merge our initial concepts, as well as to spark new ideas. We examined in-car navigation from two perspectives, input and output, and summarized the human-device interactions and corresponding issues in the following chart.

Team Ideation

Our team conducted a brainstorming session to generate as many possible use cases, and solutions to go with them, as we could. Brainstorming is a technique invented by Alex Osborn in 1938, best conducted in a group of 3-9 people. There are no right or wrong answers at this point in the process, so we aimed for quantity over quality.


After the ideation phase, we identified three ways Charlie could help the driver reach their destination, combining different multimodal elements to develop a seamless interaction between Charlie, the driver, and even a passenger seated in the car.

3 Navigation Modes

1. Guided Mode - Leader

Current navigation applications speak a mechanical language and describe the road in numbers. That information is meaningless to drivers who lack contextual awareness of the road. In guided mode, Charlie offers navigation information in conversational language and develops a customized language set for each user. To prevent errors and make accurate decisions, Charlie notifies the user when she does not understand the input or the road conditions, and prompts the user to act accordingly.
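As a rough illustration of the difference between mechanical and conversational guidance, one simple approach is to anchor an instruction to a nearby landmark instead of a distance in meters. The function below is a hypothetical sketch, not part of the actual concept:

```python
def conversational(turn, distance_m, landmark=None):
    """Sketch: rephrase a mechanical turn instruction conversationally.

    Falls back to the distance-based phrasing when no landmark is known.
    """
    if landmark:
        return f"Turn {turn} just after {landmark}."
    return f"In {distance_m} meters, turn {turn}."

conversational("left", 400, landmark="the red church")
# -> "Turn left just after the red church."
conversational("right", 200)
# -> "In 200 meters, turn right."
```

A real system would of course draw landmarks from map data and tailor the phrasing to the user's customized language set.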

2. Learning Mode - Follower

In real-life driving scenarios, people often want to take a detour or follow preferred routes that differ from what their navigation recommends. Learning mode runs machine learning in the background while Charlie is muted and follows the user's lead. In this mode, Charlie learns the user's preferred routes and driving habits and applies them to her future guidance.
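One minimal sketch of what this background learning could look like: silently count which road segment the driver actually chooses at each decision point, and prefer the most frequent choice next time. The class and names below are hypothetical; a real system would use richer signals than raw counts.

```python
from collections import Counter, defaultdict

class RoutePreferences:
    """Sketch: learn the driver's habitual choice at each decision point."""

    def __init__(self):
        # Maps a decision point to a Counter of road segments taken there.
        self.choices = defaultdict(Counter)

    def observe(self, decision_point, segment_taken):
        # Called silently in learning mode whenever the driver picks a road.
        self.choices[decision_point][segment_taken] += 1

    def preferred(self, decision_point, default):
        # Suggest the driver's most frequent past choice, if any.
        counts = self.choices.get(decision_point)
        if not counts:
            return default
        return counts.most_common(1)[0][0]

prefs = RoutePreferences()
prefs.observe("5th & Main", "scenic backroad")
prefs.observe("5th & Main", "scenic backroad")
prefs.observe("5th & Main", "highway ramp")
prefs.preferred("5th & Main", "highway ramp")  # -> "scenic backroad"
```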

3. Assisted Mode - Helper

In assisted mode, a passenger guides the driver using visual guidance offered by Charlie, while Charlie herself defaults to mute. When the passenger is distracted, the driver can re-activate Charlie by double-tapping or calling Charlie's name to acquire guidance for the next step. Learning mode can run concurrently with this mode so that Charlie picks up the guiding passenger's language.
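The three modes and their activation triggers can be summarized as a small state machine. This is a hypothetical sketch for illustration; the mode names follow the concept above, but the event names and exact transitions are assumptions:

```python
from enum import Enum, auto

class Mode(Enum):
    GUIDED = auto()    # Leader: Charlie speaks directions
    LEARNING = auto()  # Follower: Charlie is muted and observes the driver
    ASSISTED = auto()  # Helper: a passenger guides; Charlie is visual-only

class CharlieModes:
    """Minimal sketch of Charlie's mode switching."""

    def __init__(self, mode=Mode.GUIDED):
        self.mode = mode
        self.muted = mode is not Mode.GUIDED

    def handle_event(self, event):
        if event in ("double_tap", "wake_word"):
            # Double tap or the wake word re-activates voice guidance.
            self.muted = False
        elif event == "mute":
            # Muting hands the lead to the driver: learning mode.
            self.muted = True
            self.mode = Mode.LEARNING
        elif event == "passenger_guiding":
            # A passenger takes over; Charlie defaults to visual-only.
            self.mode = Mode.ASSISTED
            self.muted = True
        return self.mode, self.muted

charlie = CharlieModes()
charlie.handle_event("mute")        # -> (Mode.LEARNING, True)
charlie.handle_event("double_tap")  # -> (Mode.LEARNING, False)
```

Note that muting and mode are tracked separately, which mirrors how learning mode can keep running while assisted mode is active.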

Final Concept

Next Steps

Our team would like to test Charlie's intelligence and functionality. We would create a behavioral prototype to understand whether Charlie gives useful and intuitive information to the driver. This would be a crucial step in developing the navigation system.

Visual navigation systems can give cryptic feedback and increase cognitive load, which in some cases can prove fatal

By incorporating those you trust in the navigation process, Charlie can indirectly empower drivers to make safer decisions

Charlie can pick up on contextual cues and can then provide actionable feedback based on what you need in real-time