
Oculus Full Body Test using Kinect

Matt1Up
Level 2



That little white dot you see represents the Kinect. I programmed this using Touch Designer.

What you are seeing is a direct output of the actual Kinect scan. It is run through a GLSL shader, so there is hardly any lag at all. I already have the actual fingertips tracked as well. I will have an awesome demo done soon.
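For anyone curious, the heavy lifting in that GLSL pass is just the standard pinhole back-projection: each depth pixel becomes a 3D point from its row, column, and depth value. Below is a minimal numpy sketch of the same math on the CPU for clarity; the resolution and focal length are assumed typical Kinect v1 values, not calibrated ones.

```python
# Minimal CPU sketch of the depth-to-3D back-projection the shader does on
# the GPU. Intrinsics below are assumed Kinect v1 ballpark values; a real
# setup should use per-device calibration.
import numpy as np

W, H = 640, 480          # Kinect v1 depth resolution
FX = FY = 585.0          # assumed focal length in pixels
CX, CY = W / 2.0, H / 2.0

def depth_to_points(depth_m):
    """depth_m: (H, W) array of depth in meters; returns (H, W, 3) points."""
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth_m
    x = (u - CX) * z / FX    # pinhole model: X = (u - cx) * Z / fx
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1)

# Example: a flat wall 2 m away
points = depth_to_points(np.full((H, W), 2.0))
print(points.shape)  # (480, 640, 3)
```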


I have a 3D model of my living room that I made, and it lines up perfectly with the real-world view coming from the Kinect's RGB camera, so I am going to be able to walk around my living room... while wearing the Oculus.
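One piece of that alignment is giving the virtual camera the same field of view as the Kinect's RGB camera, or the model won't overlay the live feed. Here is a small sketch of the FOV math, assuming the commonly cited 62-degree horizontal FOV for the Kinect v1 RGB camera (not a measured value; whether your renderer wants the horizontal or vertical angle depends on the engine).

```python
# Convert the Kinect RGB camera's horizontal FOV to the matching vertical
# FOV for its 4:3 frame, so the render camera can be set up either way.
import math

def vertical_fov(horizontal_fov_deg, width, height):
    """Vertical FOV that matches a given horizontal FOV for an image size."""
    h = math.radians(horizontal_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) * height / width))

print(vertical_fov(62.0, 640, 480))  # ~48.5 degrees for the 640x480 RGB frame
```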

Still got a little bit of work to do on the character rigging, but this shows you the virtual living room.





And some virtual glowsticks would be pretty cool with the Oculus I think. You could also incorporate this as some sort of weapon or control system within a game. 😉


6 REPLIES

huttybangbang
Level 2
Augmented virtual reality? Looks cool. Would love to check out the demo 😄

Btw, did you grab the image of your room from the Kinect or was it hand created using a 3D application?
Oculus Hut - A big fat slice of VR
A page I made about surfing the virtual reality web!
Janus VR Room 45: janus://zion.vrsites.com/1/45
YouTube: https://www.youtube.com/channel/UCj70pq7hA26fvLv832NmboA
A little UE4 experience: Virtual Bard

Matt1Up
Level 2
I hand-created it. I make the main models in Sketchup and then import them into Touch Designer as an FBX, where I take care of all the lighting and textures.

DeadlyJoe
Level 8
Very cool demo. I would love to see a demonstration where you're controlling the avatar but from a first-person perspective, with the avatar replacing your body. This is going to be one of the key technologies needed to make many first-person VR games truly immersive.

Guspaz
Level 2
If you put a few Kinect sensors around the room, you could theoretically combine the 3D and texture data in real time and build a full live environment that updates as things change. The problem I see is that even if you had enough Kinect sensors to fuse the data into a decently complete environment, the resolution of the depth data is going to be so low that everything will be really... funky.

EDIT: Also, neat stuff, OP 🙂
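A minimal numpy sketch of the fusion idea Guspaz describes: each sensor's point cloud gets transformed by that sensor's pose (rotation plus translation, found by calibration) into one shared room frame, then merged. The poses below are made-up placeholders, not real calibration data.

```python
import numpy as np

def make_pose(yaw_deg, t):
    """Rotation about the vertical axis plus a translation, as a 4x4 matrix."""
    a = np.radians(yaw_deg)
    pose = np.eye(4)
    pose[:3, :3] = [[ np.cos(a), 0, np.sin(a)],
                    [ 0,         1, 0        ],
                    [-np.sin(a), 0, np.cos(a)]]
    pose[:3, 3] = t
    return pose

def fuse(clouds, poses):
    """clouds: list of (N, 3) point arrays; poses: matching 4x4 sensor poses."""
    merged = []
    for pts, pose in zip(clouds, poses):
        homog = np.c_[pts, np.ones(len(pts))]     # (N, 4) homogeneous points
        merged.append((homog @ pose.T)[:, :3])    # into the shared room frame
    return np.vstack(merged)

# Two sensors roughly 90 degrees apart (the angle Matt1Up mentions below
# for dodging IR interference), at placeholder positions.
poses = [make_pose(0, [0, 1.0, 2.5]), make_pose(90, [2.5, 1.0, 0])]
clouds = [np.random.rand(1000, 3), np.random.rand(1000, 3)]
room = fuse(clouds, poses)
print(room.shape)  # (2000, 3)
```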

Matt1Up
Level 2
I am able to do that, but no matter what it is never gonna be as realistic as just seeing your actual body (as far as hands and control go, anyway, imo).

One of the things I was thinking about trying is to take the incoming Kinect scan data and implement some kind of ribbed/sliced setup that attaches to a stick model of your skeleton (based on the Kinect skeleton data, of course). It would somehow measure the curvature/extents of your body's shape so that it could be replaced with geometry and wrapped with a character's texture. This would let you keep your actual body shape and could adjust for each player.
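A rough numpy sketch of that ribbed/slices idea: take the scan points near one bone (a segment between two Kinect joints), cut them into slices along the bone axis, and record each slice's widest radial extent. Those radii could then drive the replacement geometry that gets wrapped with the character's texture. The joints and points below are placeholders, not real Kinect data.

```python
import numpy as np

def slice_radii(points, joint_a, joint_b, n_slices=8):
    """Radial extent of `points` in each slice along the bone a->b."""
    axis = joint_b - joint_a
    length = np.linalg.norm(axis)
    axis = axis / length
    rel = points - joint_a
    along = rel @ axis                       # distance of each point along the bone
    radial = np.linalg.norm(rel - np.outer(along, axis), axis=1)
    radii = np.zeros(n_slices)
    idx = np.clip((along / length * n_slices).astype(int), 0, n_slices - 1)
    for s in range(n_slices):
        mask = idx == s
        if mask.any():
            radii[s] = radial[mask].max()    # widest scan point in this slice
    return radii

# Example: fake forearm points scattered around a bone from elbow to wrist
elbow, wrist = np.array([0, 1.0, 0.0]), np.array([0, 1.3, 0.0])
pts = np.random.normal([0, 1.15, 0], [0.04, 0.1, 0.04], (500, 3))
print(slice_radii(pts, elbow, wrist))
```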

Matt1Up
Level 2
I have had two Kinect sensors set up before, but you run into problems because the infrared lasers start interfering with one another. The only way you can even come close is to have them at a 90-degree angle to one another.