---
title: "iVizLab - Research"
layout: textlay
excerpt: "iVizLab -- Sensing Humans (Bio/Brain/Face/Movement/VR)"
sitemap: false
permalink: /bioBrainVR/
---


# Sensing Humans (Bio/Brain/Face/Movement/VR)


<strong>[See Related Papers](#paperSection) and Related Projects:</strong> <br>
[AI-based Exploration of Art and Design]({{ site.url }}{{ site.baseurl }}/aiArts) :: [AI Cognitive Creativity]({{ site.url }}{{ site.baseurl }}/aiCreativity) :: [AI Affective Virtual Human]({{ site.url }}{{ site.baseurl }}/virtualHumans) <br>
[XR Avatars: Edu, Coaches, Health]({{ site.url }}{{ site.baseurl }}/xrAvatars) <br>

Researchers: Steve DiPaola, Meehae Song

**About:**
Our lab has extensive experience in using different sensing technologies to better understand and incorporate human intent and interaction. These include eye tracking and facial emotion recognition (DiPaola et al 2013); gesture, body, and hand tracking; bio-sensing of heart rate and EDA (Song & DiPaola, 2015); and brain-wave sensing (BCI). We use these signals both to better understand the person and to support more human-centered interaction with our affective generative systems, where they drive the generation and help us gauge the reception of the generated graphics (still, video, VR).

**The Research:**
Emotional facial tracking using a camera and AI software. Motion, gesture, and body tracking using overhead cameras and the MS Kinect. Hand tracking via our own data gloves and the Leap controller. Eye tracking via our Pupil eye tracker. Bio-sensing (heart rate and EDA) via our Empatica E4 watch. Brain-wave sensing via Muse and other systems.
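
As a rough illustration of how data from one of these sensors can be read in software, here is a minimal sketch in Python that assumes the device (for example, an EEG headband) is already exposed as a Lab Streaming Layer (LSL) stream by a third-party bridge; the stream type and the simple read loop are illustrative assumptions, not our exact pipeline.

```python
# Minimal sketch (assumption: the sensor is published as an LSL stream,
# e.g., by a headband-to-LSL bridge). Not our production pipeline.
from pylsl import StreamInlet, resolve_byprop

def open_inlet(stream_type="EEG", timeout=5.0):
    """Find the first LSL stream of the given type and return an inlet for it."""
    streams = resolve_byprop("type", stream_type, timeout=timeout)
    if not streams:
        raise RuntimeError(f"no LSL stream of type {stream_type!r} found")
    return StreamInlet(streams[0])

if __name__ == "__main__":
    inlet = open_inlet("EEG")
    for _ in range(256):                      # read a short burst of samples
        sample, timestamp = inlet.pull_sample()
        print(timestamp, sample)              # one multi-channel sample per line
```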
**Setup and Results:**
Some examples of our tracking systems are shown below. All our 2D, 3D, and VR systems have an abstraction layer with software modules to support several advanced input technologies such as emotion tracking, motion tracking, and bio-sensors.
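
As an illustration of what such an abstraction layer can look like, here is a hypothetical sketch in which every sensing module reports named signals normalized to a 0..1 range so that any generative system can consume them interchangeably; the class and signal names are invented for this example, not our actual module API.

```python
# Hypothetical sensor-abstraction sketch: each device is wrapped in a module
# that reports named, normalized (0..1) signals; the generative systems read
# one merged dictionary per frame instead of talking to devices directly.
from abc import ABC, abstractmethod
from typing import Dict, List

class SensorModule(ABC):
    @abstractmethod
    def poll(self) -> Dict[str, float]:
        """Return the latest readings as {signal_name: value in [0, 1]}."""

class HeartRateModule(SensorModule):
    def __init__(self, source):
        self.source = source                  # assumed handle to a wrist sensor

    def poll(self) -> Dict[str, float]:
        bpm = self.source.read_bpm()          # assumed device-specific call
        return {"heart_rate": min(max((bpm - 40.0) / 140.0, 0.0), 1.0)}

class InputHub:
    """Aggregates all registered modules into one signal dictionary per frame."""
    def __init__(self):
        self.modules: List[SensorModule] = []

    def register(self, module: SensorModule) -> None:
        self.modules.append(module)

    def poll_all(self) -> Dict[str, float]:
        signals: Dict[str, float] = {}
        for module in self.modules:
            signals.update(module.poll())
        return signals
```

A generative scene would then map entries such as `heart_rate` onto visual parameters (the pulse of a heart model, flock speed, and so on) without needing device-specific code.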

<br>

DiPaola in our lab, demonstrating brain and heart-rate sensing for health, where it is possible to control and visualize your own heart and feelings; the scene moves from the universe, to flocking birds, to your beating heart, for mental health.
<iframe width="450" height="230" src="https://www.youtube.com/embed/MYZDRSRadaY?rel=0" frameborder="0" allowfullscreen></iframe>
Our work where a mental health counsellor uses our system to create (dream of) a happy place and then, via brain and breathing control, brings the VR patient into the calm place she has created with her mind and breath. For mental health: the outer sphere expands and contracts with her breathing, and the flock of birds is driven by her brain waves (alpha waves here).
<iframe width="450" height="230" src="https://www.youtube.com/embed/8mWX9cWJolQ?rel=0" frameborder="0" allowfullscreen></iframe>
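As a rough sketch of the kind of signal-to-visual mapping behind a scene like the one above (the band edges, sampling rate, and flock parameter are illustrative assumptions, not the exact values we use), relative alpha-band power can be estimated from a short window of EEG and used to steer something like flock cohesion:

```python
# Illustrative mapping: relative alpha-band (8-12 Hz) power from a window of
# single-channel EEG, clipped to 0..1 so it can drive a visual parameter.
import numpy as np

def band_power(samples, fs, lo, hi):
    """Total power of the signal between lo and hi Hz via a simple periodogram."""
    samples = np.asarray(samples, dtype=float)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    power = np.abs(np.fft.rfft(samples - samples.mean())) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].sum()

def alpha_to_cohesion(samples, fs=256.0):
    """Alpha power relative to the 1-40 Hz total, as a 0..1 control value."""
    alpha = band_power(samples, fs, 8.0, 12.0)
    total = band_power(samples, fs, 1.0, 40.0)
    return float(np.clip(alpha / (total + 1e-9), 0.0, 1.0))

# Each frame, something like: flock.cohesion = alpha_to_cohesion(recent_eeg)
```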
More studies with heart rate (via the watch) and breathing driving our systems and generated graphics.
<iframe width="450" height="230" src="https://www.youtube.com/embed/JaOKbKGkwVw?rel=0" frameborder="0" allowfullscreen></iframe>
<iframe width="450" height="230" src="https://www.youtube.com/embed/0FaDEymjxbg?rel=0" frameborder="0" allowfullscreen></iframe>
<iframe width="450" height="230" src="https://www.youtube.com/embed/7I3heXUNZ8U?rel=0" frameborder="0" allowfullscreen></iframe>
<iframe width="450" height="230" src="https://www.youtube.com/embed/rm7iR-WvHSM?rel=0" frameborder="0" allowfullscreen></iframe>
Breath-controlled art.
<iframe width="450" height="230" src="https://www.youtube.com/embed/cncSjzDkkEk?rel=0" frameborder="0" allowfullscreen></iframe>
Emotional facial recognition combined with movement and placement recognition and hand/finger tracking, where our AI-aware avatar responds.
<iframe width="450" height="230" src="https://www.youtube.com/embed/I-sZEyvtsXk?rel=0" frameborder="0" allowfullscreen></iframe>


<div id="paperSection"></div>


<br><br>
**------ PAPERS: Sensing Humans (Bio/Brain/Face/Movement/VR) ------**


{% for publi in site.data.publist2 %}