
Project background

In everyday life, gestures already carry specific meanings: you wave by raising your hand and moving it from side to side to say "hello" or "goodbye". I wanted to bring this sort of everyday gesture into a hand gesture interface for mixed reality. Building on gestures people already know makes the interface easier to control, learn, and remember.


Project outcome 1: A meaningful and intuitive hand gesture library

I created a gesture library to explore hand gesture interfaces through working prototypes.


Categories

Project outcome 2: HGI (Hand Gesture Interface) for a self-driving car's infotainment system

“Send a text message with HGI in a self-driving car”



 Onboarding flow for sending a text message by voice input

Project outcome 3: HGI (Hand Gesture Interface) for virtual reality

First, I created a virtual hand in Unity that mirrors your hand movement precisely. Because the glove's sensors track each joint of the hand, interacting in virtual reality feels close to handling a physical environment.
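To give a sense of the joint tracking, here is a minimal sketch in plain C++ (for illustration only; the calibration constants and the 0 to 90 degree range are assumptions, not the prototype's actual values) of how a raw bend-sensor reading can become a joint rotation for the virtual hand:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative calibration: assumed readings, not the prototype's real ones.
const int kRawFlat = 310;   // sensor reading with the finger fully extended
const int kRawBent = 620;   // sensor reading with the finger fully curled

// Convert a raw flex-sensor reading into a joint bend angle in degrees.
float JointAngleDegrees(int raw) {
    float t = float(raw - kRawFlat) / float(kRawBent - kRawFlat);
    t = std::clamp(t, 0.0f, 1.0f);   // ignore readings outside calibration
    return t * 90.0f;                // 0 deg = straight, 90 deg = fully bent
}

int main() {
    // In the prototype, each frame of glove data would be converted like this
    // and applied to the matching joint of the virtual hand in Unity.
    const int samples[] = {300, 400, 500, 650};
    for (int raw : samples) {
        std::printf("raw %d -> %.1f deg\n", raw, JointAngleDegrees(raw));
    }
    return 0;
}
```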

 


Virtual Hand Manipulation 

Direct hand manipulation using the gesture glove in Unity

Finger Pointing & Hover States 

Interacting with the VR interface using the gesture glove. The hover state is an essential element when designing a VR interface; it helps users feel more connected to the content. When the index finger points at a piece of content, that content is highlighted.
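As a rough illustration of the pose logic (the finger ordering and the bend-angle thresholds are assumptions, not the prototype's tuned values), pointing can be detected by checking that the index finger is extended while the other fingers are curled; only then is the content under the index finger given the hover highlight:

```cpp
#include <array>
#include <cstdio>

// Bend angle per finger in degrees: thumb, index, middle, ring, pinky.
struct HandPose {
    std::array<float, 5> bend;
};

// Illustrative thresholds (assumed): index nearly straight, others curled.
bool IsPointing(const HandPose& p) {
    bool indexExtended = p.bend[1] < 30.0f;
    bool othersCurled  = p.bend[2] > 60.0f &&
                         p.bend[3] > 60.0f &&
                         p.bend[4] > 60.0f;
    return indexExtended && othersCurled;
}

int main() {
    HandPose pointing{{20.0f, 10.0f, 75.0f, 80.0f, 70.0f}};
    HandPose openHand{{15.0f, 10.0f, 12.0f, 14.0f, 18.0f}};
    // When the pose reads as pointing, the prototype highlights whatever
    // content the index finger is aimed at (the hover state).
    std::printf("pointing pose: %d, open hand: %d\n",
                IsPointing(pointing), IsPointing(openHand));
    return 0;
}
```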

Volume control 

The distance between the thumb and index finger is used to turn the volume up or down. Thanks to our proprioceptive system, we intrinsically know the maximum and minimum of that span without seeing it. This gesture uses the thumb and index finger because they are used more frequently than the other three fingers.

"The proprioceptive gesture is powerful for fine tuning interactions"

"The proprioceptive gesture is powerful for fine tuning interactions"

The distance in between our thumb and index fingers is a proprioceptive interaction, as we intrinsically know the maximum and minimum of the span without seeing it. The reason why this gesture use thumb and index finger is that our thumb and index finger are more frequently used than other three fingers.

 

For instance, the distance in between our thumb and index fingers is a proprioceptive interaction, as we intrinsically know the maximum and minimum of the span without seeing it.

HOW IT WORKS

1. Integrating Arduino with Unity

Connecting Arduino to Unity to build a fully working prototype.

2. Integrating Particle Photon with an Electron web app

Implementing a fully working prototype of the car infotainment system using Node.js and Particle Photon.

How it works (HGI for Virtual Reality)

The working prototype for Virtual Reality using Arduino and Unity.
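A sketch of the Arduino side of this setup (the pin assignments, the 115200 baud rate, and the update rate are assumptions): the five flex-sensor readings are streamed over serial as one comma-separated line per update, which a Unity script can read from the same serial port and parse each frame to drive the virtual hand's joints.

```cpp
// Arduino-style sketch, illustrative only.
// One flex sensor per finger: thumb, index, middle, ring, pinky (assumed pins).
const int kFlexPins[5] = {A0, A1, A2, A3, A4};

void setup() {
    Serial.begin(115200);   // Unity opens the same port at the same baud rate
}

void loop() {
    // Send one comma-separated line per update, e.g. "312,455,610,598,577".
    for (int i = 0; i < 5; i++) {
        Serial.print(analogRead(kFlexPins[i]));
        Serial.print(i < 4 ? ',' : '\n');
    }
    delay(16);   // roughly 60 updates per second
}
```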


How it works (HGI for Car infotainment system)

The working prototype for the car infotainment system using an Electron web application, Node.js, and Particle Photon.
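A sketch of the device side of this setup (the pin, the threshold, and the event name are assumptions): the Photon publishes a gesture event to the Particle cloud, and the Node.js server subscribes to those events and forwards them to the Electron infotainment UI.

```cpp
// Particle Photon firmware, illustrative only.
const int kFlexPin = A0;          // assumed bend-sensor pin
const int kBentThreshold = 2500;  // Photon ADC range is 0-4095

bool wasBent = false;

void setup() {
    pinMode(kFlexPin, INPUT);
}

void loop() {
    bool isBent = analogRead(kFlexPin) > kBentThreshold;
    if (isBent && !wasBent) {
        // Publish once on the transition; the Node.js app listens for this
        // event and updates the Electron UI (event name is a placeholder).
        Particle.publish("gesture", "index_bend", PRIVATE);
    }
    wasBent = isBent;
    delay(50);
}
```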


How I got here: Understanding users' behavior patterns

I created the gesture library to explore hand gesture interfaces based on insights from over three rounds of user testing.

Design intention

1. Explore everyday gestures as an integral part of our research

2. Explore users' physical interactions with their hands and analyze them to find patterns

3. Compare the similarities between everyday gestures and the gestures people would choose to control their devices

4. Explore the use of hand gestures as a means of human-computer interaction, and develop intuitive and ergonomic gesture interfaces for controlling user interfaces such as in-car infotainment systems and AR/VR

5. Develop the hand gesture interface by mimicking familiar physical interactions with physical tools. This metaphor makes our gesture interface easier to control, learn, and remember.


The key things I wanted to understand

  • What are some examples of hand gestures in everyday life?
  • Are there any common symbols or metaphors for the chosen gestures?
  • What are the most common gestures for the interface?
  • How difficult is it to imagine gestures for everyday life? (Average difficulty of imagining gestures)
  • How are hand gestures used as a means of human-computer interaction?
  • What is the median of the sensor values for the chosen gestures? (median sensor value per gesture)
  • Analyze the sensor values, i.e. the flexion of each finger, measured with the flexible bend sensors (a sketch of this step follows the list)
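As a rough illustration of that last analysis step (the sample values below are made up), the recorded bend-sensor readings for a gesture are sorted and their median is taken as that gesture's representative flexion value:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Return the median of the recorded sensor readings for one gesture.
int MedianReading(std::vector<int> samples) {
    std::sort(samples.begin(), samples.end());
    size_t n = samples.size();
    return (n % 2 == 1) ? samples[n / 2]
                        : (samples[n / 2 - 1] + samples[n / 2]) / 2;
}

int main() {
    // Made-up readings of the index finger's bend sensor for one gesture.
    std::vector<int> indexFingerSamples = {512, 498, 530, 505, 521};
    std::printf("median flexion: %d\n", MedianReading(indexFingerSamples));
    return 0;
}
```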

