I am not at all satisfied with the Kinect SDK, which is why I am using your software: it is far better, and real-time. Congrats on achieving this accuracy.
I managed to calibrate an RGB camera with one of the cameras used in iPi Soft. Right now I can stream the data and update a 3D model with it (including the root position, which I understand puts it in the correct world coordinates).
So what I am left with is getting the pose of the Kinect in world coordinates (iPi Soft mocap). I see that I can get it from the Scene tab when I load an iPiMotion file, but I need this information during a Live session. How can I get that? And what is the order of transforms to go from pan/tilt/roll to a rotation matrix?
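To make the question concrete, here is roughly what I am doing on my end. This is only a sketch under my own assumptions: I am guessing the axis assignment (pan about the vertical Y axis, tilt about X, roll about the camera's forward Z axis) and the multiplication order, which is exactly what I need you to confirm. numpy is used only for the matrix math.

```python
import numpy as np

def rot_x(a):  # tilt (assumed to be a rotation about the X axis)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def rot_y(a):  # pan (assumed to be a rotation about the vertical Y axis)
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def rot_z(a):  # roll (assumed to be a rotation about the camera's forward Z axis)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

def camera_rotation(pan_deg, tilt_deg, roll_deg):
    p, t, r = np.radians([pan_deg, tilt_deg, roll_deg])
    # Assumed composition: roll applied first, then tilt, then pan.
    # This order is my guess -- it is the convention I am asking about.
    return rot_y(p) @ rot_x(t) @ rot_z(r)

# Once I have the correct rotation R and the camera position t from the
# Live session, I would map a point from camera to world coordinates as:
#   p_world = R @ p_camera + t
```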