PostPosted: Fri Nov 11, 2016 6:53 pm 

Joined: Fri Nov 11, 2016 6:25 pm
Posts: 2
Hi all,
Today is my first day with the 30-day trial using a single Kinect 2.0.
My first test capturing a dance is promising, but the arm tracking is failing when the body rotates.

I'd sincerely appreciate your eyes on my first test:
https://youtu.be/JY5oxu0wnvU

Does rotation that leads to brief arm occlusion generally fail like this?
Is this solved with a second camera, using Move controllers, or something else entirely?
Or is this to be expected and cleaned-up in Maya?

Many thanks for your help!


PostPosted: Mon Nov 14, 2016 8:03 am 

Joined: Thu Sep 04, 2014 9:47 am
Posts: 739
Location: Florida USA
...

Occlusion is a pretty common occurrence with that type of motion when using 1 Kinect. Dual Kinects give much better tracking results, but you need 2 computers that can handle the Kinect v2, calibration comes into play, and you will lose capture volume (or need a larger area to record in). There is more info on this in the iPi Docs.
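To put rough numbers on that capture-volume trade-off, here is a quick sketch in Python (nothing iPi-specific; the ~70 degree horizontal FOV and 0.5-4.5 m depth range are just the commonly quoted Kinect v2 specs, so treat the result as a ballpark):

Code:
import math

H_FOV_DEG = 70.0   # commonly quoted Kinect v2 horizontal depth FOV (approx.)
                   # usable depth range is roughly 0.5-4.5 m

def capture_width(distance_m):
    """Approximate width of the depth frustum at a given distance from the sensor."""
    return 2.0 * distance_m * math.tan(math.radians(H_FOV_DEG / 2.0))

for d in (2.0, 3.0, 4.0):
    print(f"at {d:.1f} m the frustum is roughly {capture_width(d):.1f} m wide")

# With two sensors set at an angle, the actor has to stay inside BOTH frustums,
# so the overlapping (usable) volume shrinks unless the room lets you pull the
# sensors further back.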

In my opinion, dual Kinects are the way to go for any capturing with Kinects, provided you have the minimum area and meet the requirements for using two or more.

You will most likely never get perfect tracking without some bad areas that need to be fixed, but the good thing is that iPi Studio allows this to be done fairly easily once you get the hang of it.

Most tracking issues can be cleaned up right in iPi: stop the tracking right when it loses track, switch to the IK tool, reposition the arm/leg and hit the Refit button, then continue until tracking is lost again.

If it keeps losing track and the tracking process has already passed these occlusion areas, go to the affected frames, use frame advance with the > key, and pull each arm/leg key closer to the arm/leg point cloud through the area (you will of course either need to guess a close trajectory here, or use the video reference to help). Once past the bad areas, let the program take over again by hitting Track Forward. This can be done with any arm or leg issue like this.
(NOTE: DO NOT rerun the Track Forward process on the areas you just manually fixed; it will just mess up and undo what you just manually set.)

You can spin the camera for a better view while aligning during this step, then hit the Camera 1 button again to start tracking from the forward view.

In fully occluded areas, use the IK tool to reposition the arm(s)/leg(s) as above, but don't let the tracking run again on those areas; just move on past them after manual repositioning. Using the Refine button in later steps should help maintain where you manually positioned those areas, but possibly not where the cloud data is totally invisible; there only manual re-setting will apply, and no auto-correct features will work.
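To make the order of those manual-fix steps concrete, here is a toy sketch of the loop in Python; the strings are just my paraphrase of the GUI actions above, not iPi commands or an API:

Code:
# Toy sketch of the fix-and-continue loop described above. None of these
# strings are iPi commands; they only mirror the order of the GUI steps.

def correction_loop(total_frames, occluded_frames):
    """occluded_frames: frames where the tracker loses the arm/leg."""
    steps = []
    for frame in range(total_frames):
        if frame in occluded_frames:
            # Stop tracking here, switch to the IK tool, pull the limb onto
            # the point cloud, press Refit, then advance one frame with ">".
            steps.append(f"frame {frame}: manual IK repose + Refit (do not re-track)")
        else:
            steps.append(f"frame {frame}: Track Forward")
    return steps

# Example: the arm is hidden behind the torso on (made-up) frames 4-6.
for step in correction_loop(10, {4, 5, 6}):
    print(step)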

For now you don't seem to have any Move controllers or the dongles (though a laptop with built-in Bluetooth may work now). Move controllers are recommended for better final tracking results, but they have no effect on the initial tracking quality; their data is added as a second step after tracking is completed, or at least after the full ROI of the take you want is done.

If you do use Moves, you will most likely also want one on the head for better head tracking with Kinect sensors, using some kind of device to hold it in place. Alternatively, you can do manual head positioning later with certain "frame advancing tricks" in iPi or in another editor; I am sure there are some YouTube tutorials to assist with this.

Once you have the initial corrections made throughout the ROI, you can then run the Refine feature (forward, backward, or both ways). If you do use Moves, you should learn how to add and position them correctly in the scene first (see the iPi Docs) and apply their data before running Refine; that way the hand rotations should remain and you will get better refined results.

Once you get the majority of the capture looking pretty good throughout, you can fix any further small areas by sliding the ROI to encompass only the smaller bad areas and rerunning the above steps just on those areas, or if they're not too bad, you can just move on.

Then run Jitter Removal, setting its parameters first to what you want. If you do use Moves, make applying their data the final step, and rerun it if any additional manual corrections have been made to the main skeleton after their data was applied, before you export the motion.
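As a quick recap, the order being recommended looks roughly like this checklist (plain Python, just my paraphrase of the GUI steps, not an iPi script):

Code:
# My paraphrase of the recommended order of operations. These are GUI steps
# in iPi Mocap Studio, not script commands.

PIPELINE = [
    "1. Rough automatic tracking over the ROI, fixing lost spans by hand as you go",
    "2. If using Move controllers: add them to the scene and apply their data",
    "3. Run Refine (forward, backward, or both ways) over the ROI",
    "4. Slide a smaller ROI over any remaining bad spots and redo steps 1-3 there",
    "5. Run Jitter Removal with the parameters you want",
    "6. Re-apply Move data if you made further manual skeleton edits after step 2",
    "7. Export the motion and finish offsets/retargeting in an external 3D editor",
]

for step in PIPELINE:
    print(step)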

Use an external 3D editor to complete any offsets needed; depending on the character used with the mocap, you will need to do this anyway.

Hope this helps, and maybe some other experienced Kinect users will add their thoughts.

...


PostPosted: Mon Nov 14, 2016 8:55 am 

Joined: Mon Aug 03, 2009 1:34 pm
Posts: 2232
Location: Los Angeles
Snapz pretty much covered it above, but that sort of error can occasionally happen even with multiple devices or cameras. Basically, Mocap Studio can't track what it can't see, and in certain poses the body itself can occlude limbs from the software.

Sometimes this is obvious, as in your case with the torso blocking the 'far' arm. Other times it is less obvious, like when an arm or a leg is fully folded. That shape can confuse the tracker because the software can't always tell which direction the joints are bent when the limb is fully collapsed. The software will try its best to guess, but it's not always correct.
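To see why a fully folded limb is ambiguous, here is a minimal planar two-joint example in Python. It's generic inverse-kinematics math, not iPi's actual solver: for the same shoulder-to-wrist distance there are generally two valid elbow solutions ('elbow up' and 'elbow down'), and a collapsed limb gives the depth data little to separate them.

Code:
import math

def elbow_solutions(target_dist, upper=0.30, forearm=0.25):
    """Both elbow solutions for a planar two-link arm reaching a given distance.

    Generic inverse-kinematics math, not iPi's solver. Returns the elbow bend
    angle (radians, measured from a straight arm) for the two mirrored
    'elbow up' / 'elbow down' solutions.
    """
    # Law of cosines: dist^2 = upper^2 + forearm^2 + 2*upper*forearm*cos(bend)
    c = (target_dist**2 - upper**2 - forearm**2) / (2 * upper * forearm)
    c = max(-1.0, min(1.0, c))   # clamp against rounding error
    bend = math.acos(c)
    return bend, -bend           # two valid poses with the same wrist position

# A nearly collapsed arm: wrist only 12 cm from the shoulder.
up, down = elbow_solutions(0.12)
print(f"elbow bend: {math.degrees(up):+.1f} or {math.degrees(down):+.1f} degrees")
# Both poses put the wrist in the same place, which is why the tracker can only
# guess which way the joint is folded when the limb is fully collapsed.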

Fortunately, this kind of error is among the easiest to correct in Mocap Studio. Just repose the arm or leg to what it should be in the middle of the error, track backwards from there, then go back to that keyframe and track forward. After that, use Refine to help smooth it out.

Be careful not to re-track the animation forward from before the error using the regular Track button, because chances are that you will re-introduce the error. (That button always tracks based on the depth data; Refine adjusts tracking from the keyed poses.)
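If it helps, here is that fix spelled out as a toy Python sketch; the frame numbers are made up and the strings are only my paraphrase of the GUI steps, not an iPi API:

Code:
# Toy sketch of the fix described above; frame numbers and step strings are
# illustrative only, not an iPi Mocap Studio API.

def plan_fix(error_start, error_end):
    """Return the GUI steps, in order, for repairing a span of bad tracking."""
    key_frame = (error_start + error_end) // 2   # repose in the middle of the error
    return [
        f"go to frame {key_frame}, repose the arm/leg with the IK tool, press Refit",
        f"track BACKWARD from frame {key_frame} toward frame {error_start}",
        f"return to frame {key_frame} and track FORWARD toward frame {error_end}",
        "run Refine over the span to help smooth it out",
        "do NOT press the regular Track button from before the error forward; "
        "it tracks from the depth data and will likely re-introduce the error",
    ]

# Example: the tracking goes bad between (made-up) frames 200 and 240.
for step in plan_fix(200, 240):
    print("-", step)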

Also note that tracking backwards through a difficult region can produce better or more accurate results than tracking forward. That's not a rule, but it often works, because moving backwards can give the software completely different clues about where the pose is going and where it came from.

When recording, try to keep in mind what the device(s) are seeing. Oftentimes you can improve the tracking results simply by turning a few degrees away from the camera to make sure it can see what you're doing in those 'trouble spots'. Knowing when to do this simply takes practice and experience. Naturally, you should rehearse your motions and maybe try slight variations to see which performances work best for the devices being used and where you've positioned them.

Try not to think of the sensors/cameras the same way as shooting for a movie or a video. The framing has nothing to do with getting 'cinematic' angles or good composition; the data you record needs to be purely useful to the tracker, and that means recording what it needs to see to get good tracking results.

Hope this helps.

_________________
Greenlaw
Artist/Partner - Little Green Dog | Demo Reel (2017) | Demo Reel (2015) | Demo Reel (2013)




PostPosted: Mon Nov 14, 2016 9:40 am 

Joined: Thu Sep 04, 2014 9:47 am
Posts: 739
Location: Florida USA
...

Yes, good tip on tracking backwards; it can sometimes help. Especially with the new algorithm, you may need to track backwards a few frames anyway and then track forward, or many times the arm will just fly right off the repositioned area. These are simple things you will learn throughout your use of the program.

Another thing you can try on some problem areas is to switch back to the old algorithm (the link is at the bottom of the tracking dialog) for just those trouble spots, after adjusting the ROI to encompass only that area. It will track at a slower frame rate and track each video frame; then you can turn it back off. This works fine when using PS Eyes, so I would think it would for Kinects also; maybe I can get some further feedback on this from Kinect users.
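Spelled out as a checklist (again just plain Python with made-up frame numbers, not iPi commands), the trouble-spot procedure looks something like this:

Code:
# My paraphrase of the trouble-spot procedure as a checklist of GUI steps.
# Frame numbers are made up; nothing here is an iPi scripting command.

def trouble_spot_steps(bad_start, bad_end):
    return [
        f"slide the ROI so it covers only frames {bad_start}-{bad_end}",
        "switch to the old algorithm (link at the bottom of the tracking dialog)",
        "re-track just that span (slower, but it tracks every video frame)",
        "switch back to the new algorithm for the rest of the take",
        "don't do the reverse (full old-algorithm track, then switch to the new one mid-take)",
    ]

# Example: a squat around (made-up) frames 480-520 keeps losing the lower legs.
for step in trouble_spot_steps(480, 520):
    print("*", step)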

I wouldn't use the above method in reverse, meaning a full track-through with the old algorithm and then switching to the new algorithm at some point; you may not get desirable results.

Yes, I didn't mention body self-occlusion even when facing fully toward the sensors. Many times arm motions too close to the body will confuse the tracking, or a squat will block the view of the lower legs and lose tracking on them; the same goes for arms in certain positions. It's just something to be aware of so you can try to limit those effects.

Once you work with it for a while, you'll find iPi is one of the most accurate lower-cost consumer mocap products available for a 2D multi-camera/sensor-based system.

...


PostPosted: Mon Nov 14, 2016 10:16 am 

Joined: Mon Aug 03, 2009 1:34 pm
Posts: 2232
Location: Los Angeles
Oh, and after re-posing, use Refit. This can help the software with its interpolation. Sometimes, clicking Refit a few times can help smooth out a difficult pose for the software.

Obviously, if Refit is making the pose worse, undo it and don't use it for that frame. Most of the time, it's very useful though.
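If it helps to picture the habit, here is a tiny sketch of 'click Refit a few times, undo if it gets worse'; refit_once and pose_error are made-up stand-ins for pressing Refit and for your own eyeballing of the pose, since iPi doesn't expose either as code:

Code:
# Illustrative only: "refit_once" and "pose_error" are made-up stand-ins for
# pressing Refit and for judging the pose by eye; iPi exposes neither as code.

def refit_until_stable(pose, refit_once, pose_error, max_clicks=3):
    """Apply refit a few times, keeping the best pose seen so far."""
    best = pose
    for _ in range(max_clicks):
        candidate = refit_once(best)
        if pose_error(candidate) < pose_error(best):
            best = candidate      # the click helped, keep it
        else:
            break                 # it made things worse: "undo" and stop
    return best

# Toy usage: a 1-D "pose" that each refit click nudges toward 0 (the ideal).
print(refit_until_stable(0.9, lambda p: p * 0.5, abs))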

_________________
Greenlaw
Artist/Partner - Little Green Dog | Demo Reel (2017) | Demo Reel (2015) | Demo Reel (2013)


