
At a glance

As part of an advanced R&D team with an iterative working model, I contributed to an interdisciplinary group developing in-vehicle HMI/HCI experiences and product innovations by building high-quality interactive hardware/software prototypes.

Team

Experience Design Team

Date

Sep 2021 - Jan 2022
During my time at BMW, I worked on an in-car AI feature that enables a car's AI assistant to deliver a personalized in-car experience to the driver. The project is under NDA, so I'm unable to share the details of my work; instead, I built this project on what I worked on at BMW by designing a new CID UI visual concept.
HOW DOES OUR FEATURE WORK?
Using a Driver's Personal Object as an Action Trigger.
A built-in camera with machine learning capabilities detects and identifies the driver's personal object, pulls the matching record from the database, and provides AI suggestions. Everything is done automatically by the ML model; no user input is required.
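A minimal sketch of that detect-then-suggest pipeline in plain JavaScript (the object labels, database records, and suggestion texts here are hypothetical placeholders, not the NDA'd implementation):

```javascript
// Hypothetical record store keyed by the object class the camera detects.
const db = {
  "coffee-cup": { suggestion: "Navigate to your usual café?" },
  "gym-bag": { suggestion: "Start your workout playlist?" },
};

// The ML model returns class probabilities; pick the most confident label
// and, if it clears a threshold and has a record, return its suggestion.
function suggestFor(predictions, threshold = 0.8) {
  const top = predictions.reduce((a, b) =>
    b.probability > a.probability ? b : a
  );
  if (top.probability < threshold || !db[top.className]) return null;
  return db[top.className].suggestion; // no user input required
}
```

The threshold keeps low-confidence frames from triggering suggestions, so the assistant only acts when the classifier is sure about what it sees.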
UX FLOWCHART
A NEW CID UI DESIGN CONCEPT
The concept uses a clear layering system, where most of the information lives in distinct cards and windows separated by generous negative space. Following a minimal design philosophy, 3D elements appear only where they add value; 3D models of the car are used to present vehicle information.



Created the vehicle visualization with a 3D model in Blender 3D.

UI Motion Study (Welcome State)

Tools I used

1. Blender 3D

2. After Effects

3. Adobe XD

Proof-of-Concept (POC) Studies
Developing a working prototype in Unity 3D by pulling data from Google Teachable Machine into Unity using a Node.js WebSocket library.
Google Teachable Machine

Initially, I looked to Google's Teachable Machine, a machine learning tool that trains a classification model, so that the prototype could determine what the driver's personal object is in Unity. Teachable Machine allows people who are not trained in machine learning to not only understand the concept but also benefit from its basic classification capability.
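Loading an exported Teachable Machine image model in the browser looks roughly like this (a sketch, assuming the @teachablemachine/image script is on the page; the model URL is a placeholder, not my actual trained model):

```javascript
// Load an exported Teachable Machine image model and classify one video
// frame. tmImage is provided by the @teachablemachine/image browser script;
// the "XXXX" model path is a placeholder.
async function classifyFrame(videoElement) {
  const base = "https://teachablemachine.withgoogle.com/models/XXXX/";
  const model = await tmImage.load(base + "model.json", base + "metadata.json");
  // predict() returns one { className, probability } entry per trained class
  return model.predict(videoElement);
}
```

In the prototype, this runs on every camera frame, and the resulting predictions are what get forwarded to Unity.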

WebSocket (p5.js)

Web sockets are used to send the objects' data from Google Teachable Machine to Unity 3D. With object-recognition technology, the in-vehicle camera can detect a driver's personal object, such as a phone.
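A minimal sketch of the browser side of that hop, assuming a plain browser WebSocket and an illustrative JSON payload (port and message shape are my placeholders, not the original code):

```javascript
// Build the JSON payload for the most confident prediction of a frame.
function encodeDetection(predictions) {
  const top = predictions.reduce((a, b) =>
    b.probability > a.probability ? b : a
  );
  return JSON.stringify({ object: top.className, confidence: top.probability });
}

// In the p5.js sketch: open a socket to the local Node.js relay and return
// a callback that forwards each frame's classification result.
function startForwarding(url = "ws://localhost:8080") {
  const socket = new WebSocket(url); // browser-native WebSocket
  return (predictions) => socket.send(encodeDetection(predictions));
}
```

Sending a small JSON message per frame keeps the bridge simple: Unity only needs to parse one object name and one confidence value.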

Unity 3D

The working prototype runs in Unity 3D, pulling the classification data from Google Teachable Machine into Unity through a Node.js WebSocket library.
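The Node.js side of the bridge can be sketched as a small relay: it accepts the browser's classification messages and rebroadcasts them so the Unity client can consume them. This assumes the popular "ws" package; all names and the port are illustrative, not the original code:

```javascript
// Relay sketch: rebroadcast each detected-object payload from the browser
// to every other connected client (e.g. the Unity prototype).
function startRelay(port = 8080) {
  const { WebSocketServer } = require("ws"); // "ws" npm package
  const wss = new WebSocketServer({ port });
  wss.on("connection", (client) => {
    client.on("message", (data) => {
      for (const peer of wss.clients) {
        // readyState 1 === OPEN; skip the sender itself
        if (peer !== client && peer.readyState === 1) {
          peer.send(data.toString());
        }
      }
    });
  });
  return wss;
}
```

Keeping the relay dumb (no parsing, just forwarding) means the browser and Unity can evolve their message format without touching the server.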

Insights
Designers can make better design decisions if they understand how technologies such as machine learning actually work.
Machine learning can support in-car experiences in a variety of ways, from quickly processing and interpreting the large amounts of personal data generated by the vehicle's cameras and sensors to improving the driver's in-car CID experience.
Next Steps
Generate different types of use cases.
Continue to refine the UX/UI flow and UI micro-interactions.
Conduct user testing with the Unity prototype.
Iterate on the user interfaces based on feedback from user testing and other evaluation methods.


I'd describe myself as a UX/UI designer, structural thinker, rapid prototyper, and software developer with a genuine enthusiasm for building everything from mobile applications to web applications, and for creating interactive information systems for people.

 

Copyright © 2018 Bryan Beomseok Oh. All rights reserved.