Oculus Rift: how to get the position of the Touch controllers?

The question is pretty much in the title: how can we access the Touch controller positions? I just want to make the rendered Touch controller objects follow my real controllers. After that I'm going to replace the controllers with hands. I've built hands in Blender and can use them in Unity3D. I'm using transform.localEulerAngles to move the joints/bones of the hands and am trying to figure out how to improve the collision with objects. I now want to move this project to the Ab3d engine. When the BEPU or Jitter physics engine demo is incorporated into Ab3d.DXEngine, interacting with objects is going to be even more awesome. In the meantime, if it's possible, I just want the position of the controllers.

Thank you.

EDIT: Never mind, the problem is fixed. Windows 10 was lagging and my Oculus Rift tracker stands weren't functioning properly, so in both programs (Unity3D and Ab3d) my Oculus Touch controllers weren't following my hand movements at all.
I am glad that this is already fixed.

If you grab the latest version of the Oculus Wrap sample (https://github.com/ab4d/Ab3d.OculusWrap) and check the Ab3d.DXEngine.OculusWrap.Sample project, you will see that it already renders the Oculus Touch 3D controllers, which are fully transformed as you move the actual controllers in your hands.
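In case you want to read the raw controller poses yourself, the underlying native LibOVR call is ovr_GetTrackingState, which returns a tracking state whose HandPoses array holds the position and orientation of each Touch controller. A rough C# sketch of how this looks through a wrapper follows; the exact wrapper type and member names here are assumptions, so check the Ab3d.OculusWrap sources for the real ones:

```csharp
// Hypothetical sketch based on the native LibOVR ovr_GetTrackingState API.
// The C# type and member names are assumptions - verify them in Ab3d.OculusWrap.
var trackingState = ovr.GetTrackingState(sessionPtr,
                                         displayMidpointSeconds,
                                         latencyMarker: true);

// HandPoses[0] = left Touch controller, HandPoses[1] = right Touch controller.
var leftPose  = trackingState.HandPoses[0].ThePose;
var rightPose = trackingState.HandPoses[1].ThePose;

// Position is a 3D vector in tracking space (in meters);
// Orientation is a quaternion.
float x = leftPose.Position.X;
float y = leftPose.Position.Y;
float z = leftPose.Position.Z;
```

The sample project linked above already does this for you when it transforms the rendered controller models, so the sketch is mainly useful if you want the positions for your own logic.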

The next step is to add rendering of 3D hands. I already have the models, but I would like to add bone animation so that it is possible to show different hand poses. I already have a working version of bone animation, but it needs a little polish. When it is ready, I will update the Oculus sample.

I have also checked the Jitter physics engine and found it very interesting. I have added a new TODO item to create a sample that uses the engine.

It should be quite easy to use it with WPF 3D. If you check the RenderAll method in the samples that come with the Jitter physics engine, you will see that it creates boxes based on the calculated position and orientation of each body. The orientation is a 3x3 matrix, so from those two values you can create a 4x4 matrix: put the orientation values into M11 ... M33 and the position into M41 ... M43 (the OffsetX ... OffsetZ properties). From that you can create a WPF MatrixTransform3D, which can be used for transforming individual objects, for example BoxVisual3D objects from Ab3d.PowerToys.
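As a minimal sketch, the conversion described above could look like this in C# (JMatrix and JVector are Jitter's types; Matrix3D and MatrixTransform3D are the standard WPF 3D ones; depending on handedness conventions you may need to transpose the orientation matrix):

```csharp
using System.Windows.Media.Media3D;
using Jitter.LinearMath;

public static class JitterWpfHelper
{
    // Build a WPF Matrix3D from a Jitter body's 3x3 orientation matrix and position vector.
    // The orientation goes into the upper-left 3x3 (M11 ... M33),
    // the position into the fourth row (OffsetX, OffsetY, OffsetZ).
    public static Matrix3D ToWpfMatrix(JMatrix orientation, JVector position)
    {
        return new Matrix3D(orientation.M11, orientation.M12, orientation.M13, 0,
                            orientation.M21, orientation.M22, orientation.M23, 0,
                            orientation.M31, orientation.M32, orientation.M33, 0,
                            position.X,      position.Y,      position.Z,      1);
    }
}

// Usage (e.g. for a BoxVisual3D from Ab3d.PowerToys, updated each physics step):
// boxVisual3D.Transform =
//     new MatrixTransform3D(JitterWpfHelper.ToWpfMatrix(body.Orientation, body.Position));
```

Recreating a MatrixTransform3D each frame is the simplest approach; reusing one transform and updating its Matrix property also works and avoids allocations.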
Andrej Benedik
