What will our lives look like in 2025? With the advancement of numerous revolutionary technologies, our world is facing unprecedented innovation. One of the most important technologies leading this change is self-driving. The development of self-driving technology would not only improve people's quality of life but also bring fundamental changes to our society.
In this project, I first assumed a future in which self-driving technology has been fully developed. I then imagined the social landscape of 2025 and defined the new role of a self-driving vehicle in that society. Based on these expected social aspects, I designed E-Connect, an automotive and smart wall application service that can create new value in the future self-driving era.
E-Connect is an infotainment system for a self-driving vehicle. The service tracks a driver's health condition in real time, provides customized content based on the bio-sensing data, and helps the driver in an emergency by working in tandem with the smart wall application used by emergency technicians at the 911 report center.
Internship Project at Amoeba UX
Jan 2018 - Feb 2018
My Role - UX / UI Designer, UX Researcher
Performed user research, design ideation, UX workflow, UI design.
" How would the user experience of the future self-driving vehicle be like? "
Envisioning Lifestyle in 2025
At the beginning of the project, I needed to envision and conceptualize the lifestyle of society in 2025 in order to identify design opportunities in the future autonomous vehicle. To achieve this, I conducted a literature review and organized the results into two categories: Social and Technology. I then analyzed the collected data in each category, listed three notable points, and derived a combined insight for each.
The increase of one-person households and the personalization of the society
- ForMe, Staycation, 1conomy
- Sharing Economy, Social Dining
Relationships-focused spending patterns to overcome loneliness
New Consumption Pattern
In 2032, population growth will stop and the population will begin to decline
- Reduction of working age and youth population
Development of Display Technology
The advent of new technologies such as Flexible displays, Smart Mirrors
- Display installation constraints disappear
Connect Everything : C - DAS
Innovative communication tech that can lead the future smart-connected world
- Connect and control all digital devices
Key of the Future Society : AI
Deep Learning tech will change our future lifestyles and industries
- Innovation in education, finance, health care, etc.
Expected Future Societal Keywords
Based on the research, I drew out seven expected societal keywords for 2025.
Expected Social Aspects
I imagined the expected social aspects of 2025 using the derived keywords. Among them, I focused on three keywords: Alone Together, Space Blur, and Thread of Connections. Through this process, I was able to predict what our society might look like in 2025.
In social terms, first, the realization of fully developed self-driving technology would drastically reduce the rate of vehicle ownership. Second, the growth of the sharing economy would shift people's consumption patterns from purchasing to sharing. Lastly, the onset of a super-aged society would mean a shrinking young population and increasing demand for health care services.
In technological terms, first, AI technology would be utilized as a personal assistant that analyzes and manages people's everyday lives. Second, the development of big data technologies would enhance both the quantity and quality of information. Lastly, as our world becomes a super-aged society, bio-sensing technology that can track people's health conditions would grow in importance.
2. Design Ideation
Defining the role of vehicle in 2025
From the expected social aspects I had drawn out, I was able to define what roles a self-driving vehicle would perform in 2025.
Based on the insights and the defined role of a self-driving vehicle from the previous research steps, I performed an affinity diagramming exercise to group the insights and spot design opportunities in a future vehicle. As I continued grouping, it became obvious that many new design spaces would open up as our society becomes super-aged and highly personalized and as the sharing economy grows. I also noted that, within those changes, the growth of AI, big data, and bio-sensing technology would play a key role in future society. After identifying these societal and technological traits, I synthesized the insights to find design opportunities in a future self-driving vehicle.
3. Design Prototype
The E-Connect service has two displays that work together. The automotive display is what drivers and passengers of a self-driving vehicle interact with; the smart wall display is what emergency technicians interact with. Users of these two displays work in tandem to handle emergency situations. Below are the detailed description and information architecture of each display.
With the development of technology, future vehicle displays are evolving into an integrated display that combines the center fascia, cluster, and AVN. Hence, I adopted an integrated shape for the automotive display and defined its user and role. I then developed the information architecture of the display's infotainment system.
The smart wall display would be an essential part of our future lifestyles. In this project, I focused on the 911 emergency report center's smart wall display, which is usually positioned in front of emergency technicians. To clarify the details of the display, I first defined its user and role, and based on that definition, developed its information architecture.
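The mutual workflow between the two displays can be sketched as a simple message exchange: when the vehicle detects an emergency, it packages the passenger's location and bio-sensing data and sends them to the smart wall. The sketch below is illustrative only; all field and class names are assumptions, not part of a real E-Connect specification.

```python
from dataclasses import dataclass, field
import time

# Hypothetical payload the automotive display might send to the 911
# smart wall display when an emergency is detected.
@dataclass
class EmergencySignal:
    vehicle_id: str
    location: tuple          # (latitude, longitude)
    bio_data: dict           # e.g. {"heart_rate": 142, "stress": 0.9}
    timestamp: float = field(default_factory=time.time)

    def to_message(self) -> dict:
        """Serialize into the payload the smart wall would render."""
        return {
            "vehicle": self.vehicle_id,
            "location": self.location,
            "vitals": self.bio_data,
            "sent_at": self.timestamp,
        }

signal = EmergencySignal("AV-001", (37.56, 126.97), {"heart_rate": 142})
payload = signal.to_message()
```

Keeping the payload as a plain dictionary makes it easy for the smart wall side to map each key directly onto a panel of its information architecture.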
UI Design Concept
Color / Font
I started building the initial wireframes to visualize the information architecture above. I defined the content areas of the new type of display and designed the UI in a way that enables users to easily manipulate the infotainment system and experience content. While exploring the designs, I continuously held usability testing sessions with Amoeba UX designers and users to identify improvement points. Based on their feedback, I kept iterating on the wireframes to make them simpler and more intuitive.
Bio-Sensing Infotainment System
Autonomous Vehicle UI
The infotainment system of the E-Connect service is composed of two areas: the Cluster area and the Contents area. The Cluster area includes functions such as switching to the Howling menu, setting a destination, the indicator, the navigation map, and area convert. In the era of fully developed self-driving technology, there will be no distinction between the purposes of individual seats. Therefore, E-Connect provides an area convert function that lets users swap the two areas depending on the seat they are sitting in. In the Contents area, users can choose the content they want to use and return to the home screen.
When entering the Howling menu, recommended content is displayed according to the user's daily mood, which is continuously collected by bio-data sensors attached to the car seats.
Music content is automatically recommended according to the user's daily mood. The Mood Equalizer shows the user's emotional state as it reacts to the music in real time.
Users can also check their emotional-state details. E-Connect provides various information, including Daily Mood Detail, Weekly Mood, and statistical graphs of Mood, Fatigue, Emotion, and Stress.
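The mood-based recommendation described above can be sketched as a small classifier: normalized sensor readings are reduced to a daily mood, which then selects a content category. The thresholds, mood labels, and content lists below are assumptions for illustration, not the actual E-Connect logic.

```python
# Minimal sketch: map normalized (0-1) stress and fatigue readings from the
# seat bio-sensors to a daily mood, then to recommended content.
def daily_mood(stress: float, fatigue: float) -> str:
    """Classify mood from normalized stress and fatigue readings."""
    if stress < 0.3 and fatigue < 0.3:
        return "energetic"
    if stress >= 0.7 or fatigue >= 0.7:
        return "exhausted"
    return "calm"

# Hypothetical content catalog keyed by mood.
RECOMMENDATIONS = {
    "energetic": ["upbeat playlist", "travel vlog"],
    "calm": ["ambient playlist", "podcast"],
    "exhausted": ["relaxation sounds", "guided breathing"],
}

def recommend(stress: float, fatigue: float) -> list:
    """Return the content list matching the user's current mood."""
    return RECOMMENDATIONS[daily_mood(stress, fatigue)]
```

In a real system the classifier would be learned from the accumulated weekly mood statistics rather than hand-set thresholds, but the flow from sensor reading to recommendation would be the same.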
Legibility is one of the most important qualities of an infotainment system for an autonomous vehicle. Therefore, I carried out numerous rounds of cluster design iterations to find the optimal balance of legibility and visual appeal.
When E-Connect senses a passenger's health emergency, a pop-up screen appears to alert them to their current physical condition and the emergency details, and the mode automatically changes from Driving to Emergency if the passenger doesn't react to the alert.
CONNECTING TO 911
After entering emergency mode, E-Connect searches for the nearest hospital and automatically connects to a 911 emergency technician. At the same time, the passenger's physical condition is displayed and monitored in real time.
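The emergency flow above reduces to a short decision sequence: alert the passenger, switch to Emergency mode if they don't respond, and pick the closest hospital. The sketch below illustrates that sequence under stated assumptions; the hospital data, mode names, and planar-distance approximation are illustrative, not a real implementation.

```python
import math

def distance_km(a, b):
    """Rough planar distance between two (lat, lon) points, in km.
    (Approximates one degree as ~111 km; fine for a short-range sketch.)"""
    return math.hypot(a[0] - b[0], a[1] - b[1]) * 111

def handle_alert(passenger_responded: bool, mode: str = "Driving") -> str:
    """Switch to Emergency mode only when the passenger fails to respond."""
    return mode if passenger_responded else "Emergency"

def nearest_hospital(vehicle_pos, hospitals):
    """Pick the hospital closest to the vehicle's current position."""
    return min(hospitals, key=lambda h: distance_km(vehicle_pos, h["pos"]))

# Hypothetical hospital list for illustration.
hospitals = [
    {"name": "City General", "pos": (37.58, 126.98)},
    {"name": "Riverside ER", "pos": (37.50, 127.05)},
]
mode = handle_alert(passenger_responded=False)
target = nearest_hospital((37.57, 126.97), hospitals)
```

A production version would use road-network travel time rather than straight-line distance, and would fold in the hospitals' current waiting status shown on the smart wall.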
After connecting to 911, the emergency technician continuously communicates with the patient, identifies their current physical status, and provides first aid instructions.
The most important task of the 911 Safety Report Center is the proper management of, and rapid early-stage response to, numerous emergency situations. To improve the center's report management system, I designed a new UI that helps emergency technicians handle the various information related to emergency reports effectively and comprehensively.
Grouping the most frequently used functions:
Report records, Current traffic information, Regional reports, and First-aid methods.
Real-time report areas are marked on the map with alert icons, and a user can check other report areas using the location bar.
Users can check the number of reports for the day in a graph.
Monthly Briefing provides statistical information on weekly and monthly reports, along with regional details.
When 911 receives an emergency signal from a vehicle, the screen automatically changes to show the patient's condition, current location, distance to the hospital, and the current status of the nearest hospitals. This feature lets technicians respond rapidly in the early stages of an emergency.
When an emergency occurs, the navigation map shows the patient's current location and distance from the hospital, and finds the fastest route.
By receiving the patient's bio-data from the vehicle, the screen displays the patient's current physical condition in real time.
To communicate with the patient and identify their current status, the service initiates a video call.
Shows various information, including the name of the hospital, the remaining time to get there, the number of waiting patients, and the doctor in charge.
The emergency technician keeps monitoring the patient's current physical condition and provides appropriate first aid treatment based on the bio-sensing data and communication with the patient.
After the connection, the technician communicates with the patient and checks their current physical condition.
After the connection, the service shows the patient's physical condition in greater detail.
Provides the most appropriate first-aid procedures after checking the patient's physical status.
Hospital, map and arrival information will continuously be displayed until the patient’s arrival at the hospital.