Mocap with Neuron Sense Suit
Put on the Neuron Sense Suit. Each part of the suit has a graphic showing how it should be worn: on the feet, the chips sit with the wires running upward; on the hands, the chips sit with the wires running the opposite way. After putting on the suit, insert the tiny accelerometer chips into their slots.
After calibrating and recording the takes, clean up the data:
1. Click auto label.
2. Check whether there is unlabeled data that can be labeled first; after that, we can work on missing data by generating data to fill the gaps.
3. When generating data, choose the interpolation setting: Linear generates a straight line, Cubic a curved line, and Pattern-based generates data from a nearby marker. The last method to try is Model-based, which predicts the marker's position from the whole body.
4. After filling the gaps, the coverage rate of labeled markers should be very high. Even so, play back the whole motion take to check for anything strange, because auto-labeled data may be wrong: the program may bind a tracker to the wrong label when trackers overlap each other. Use Swap to fix it.
5. Smooth any jittery curves so the movement looks more stable and slick.
6. Export motion takes as .fbx (binary) files at a 30 fps frame rate.
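The gap-filling choices in step 3 can be pictured with a toy sketch (plain Python, not Motive's actual implementation). Linear interpolation draws a straight line across the dropout; cubic would instead fit a smooth curve through the neighboring frames.

```python
def fill_gap_linear(frames):
    """Fill None entries by linear interpolation between known neighbors.
    Assumes gaps are interior (known values on both sides)."""
    out = list(frames)
    i = 0
    while i < len(out):
        if out[i] is None:
            start = i - 1                      # last known frame before the gap
            end = i
            while end < len(out) and out[end] is None:
                end += 1                       # first known frame after the gap
            x0, x1 = out[start], out[end]
            span = end - start
            for j in range(start + 1, end):
                t = (j - start) / span
                out[j] = x0 + t * (x1 - x0)    # straight line across the gap
            i = end
        else:
            i += 1
    return out

# One marker coordinate with a three-frame dropout:
track = [0.0, 1.0, None, None, None, 4.0]
print(fill_gap_linear(track))  # [0.0, 1.0, 1.75, 2.5, 3.25, 4.0]
```

Pattern-based and model-based filling work the same way in spirit, but borrow the shape of the curve from a nearby marker or from the whole skeleton instead of a straight line.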
#1. Do pair recording when capturing a motion take. It helps a lot as a reference when cleaning the data.
#2. When recording motion takes, it's better to start and end with a T-pose.
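The smoothing in step 5 can be pictured as a simple centered moving average (a toy stand-in for whatever filter the software actually applies):

```python
def smooth(samples, window=3):
    """Centered moving average; a wider window smooths more,
    but also softens genuinely fast motion."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# Jittery marker coordinate before and after smoothing:
noisy = [0.0, 1.0, 0.0, 1.0, 0.0]
print(smooth(noisy))
```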
Rigging In Maya
1. Switch to the Rigging mode in the drop-down menu at the upper left.
2. In Skeleton > Quick Rig, we can use one-click auto-rigging or step-by-step rigging.
3. If using step-by-step rigging: first add the current mesh to Geometry. Then set the guide's embedding method to Polygon Soup and create the guide. The generated guide may be far from what you expect, so manually adjust the joints to the right positions. Next, generate the skeleton from the guide; if the skeleton isn't good, delete it, adjust the guide, and regenerate. At this point the skeleton is not yet connected to the mesh. Finally, do the skinning (binding the skeleton to the mesh).
4. After saving the file, export the model with its skeleton so motion data can be added: File > Send to MotionBuilder, and MotionBuilder will open.
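Under the hood, the skinning step usually means linear blend skinning: each vertex follows a weighted mix of the joints that influence it. A minimal 2D sketch (illustrative only, not Maya's API; joints reduced to plain translations):

```python
def skin_vertex(bind_pos, influences):
    """Linear blend skinning: blend each joint's transform of the
    bind-pose vertex, weighted by skin weights (weights sum to 1)."""
    x = y = 0.0
    for (jx, jy), weight in influences:      # joint translation + weight
        x += weight * (bind_pos[0] + jx)
        y += weight * (bind_pos[1] + jy)
    return (x, y)

# A vertex weighted 50/50 between two joints: joint A moves +2 in x, joint B stays.
vertex = (0.0, 1.0)
print(skin_vertex(vertex, [((2.0, 0.0), 0.5), ((0.0, 0.0), 0.5)]))  # (1.0, 1.0)
```

The vertex moves halfway with joint A, which is why badly painted weights show up as stretching around joints.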
In MotionBuilder
1. Click Display > X-Ray at the upper left so we can see the skeleton clearly.
2. In the Scene browser: modelName_Reference > modelName_Hips > right-click > Select Branches; then modelName_Hips > right-click > Zero > Rotate.
3. Drag the Character preset onto modelName_Hips; this generates a character in the Characters list. Rename it.
4. Import the motion data, zero it out, and create a character based on this data.
5. Double-click the modelName_Hips character, choose Input Type: Character, set the input source to the motion-data character, and check Active. Your model should now move with the motion data.
6. Before plotting to the character, there are quite a few settings we can adjust to fit the physical size and height of the real human actor.
7. Finally, Plot to Character > Control Rig > FK/IK. The animation now lives on the model itself, so the motion data and the character created from it are no longer needed; delete them and save the file (one file per take if needed).
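The size adjustments in step 6 amount to retargeting: scaling the captured translations by the ratio between the model's and the actor's proportions. A toy version (not MotionBuilder's actual solver; height ratio is a stand-in for its per-limb adjustments):

```python
def retarget_translation(actor_pos, actor_height, model_height):
    """Scale a captured root translation so a shorter or taller model
    covers proportionally the same ground as the real actor."""
    scale = model_height / actor_height
    return tuple(coord * scale for coord in actor_pos)

# A 1.8 m actor's hip translation retargeted onto a 0.9 m character:
print(retarget_translation((2.0, 1.0, 0.0), 1.8, 0.9))  # (1.0, 0.5, 0.0)
```

Without this scaling, a small character driven by tall-actor data slides its feet, because its legs cannot cover the actor's stride.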
In Unreal Engine
1. First, set up a file hierarchy to better manage our assets: an asset folder containing Blueprints, static meshes, and skeletal meshes.
2. Import the animation file: first import the mesh and skeleton, then import the animations that belong to that skeleton.
3. The assets are now ready for use; just drag them into the scene.
After calibrating the cameras, the first thing to do is put on the tracking suit. A tracking suit consists of shoe covers, gloves, pants, and a jacket. There are some precautions for wearing it:
-Tuck the jacket into the pants.
-The clothes must fit tightly against the body. If they are loose, the reflective markers will shift and produce dirty data.
Then we can start sticking the markers onto the dancer. Open the Create panel and select the Skeleton tab on the right side of the window; a 3D body figure shows where the markers should be placed.
The lower marker on the back must align with the midpoint of the higher markers.
Markers must be placed right on the moving joints, plus the tips of the feet.
After placing the markers, the dancer holds a T-pose while we make sure the Motive app is tracking the full number of markers we put on. When everything is done, click "Create".
Sync Optitrack System with Unreal Engine
Calibrate Camera in Motive Application
-Mask or move out all the reflective objects in the studio, checking against the map
-Click the Calibration panel on the right side and start wanding until every camera has 10k+ samples
-After wanding, click the Calculate button, then click Apply to accept the result
-Put the ground plane on the floor pointing away from the screen, and click the Set Ground Plane button in the Ground tab
Setup Rigid Body and Recording Motion Data
-Select the points of a rigid body, right-click, and choose "Create Rigid Body"
-Click Record to capture motion data, or click Live to stream it
Sending Data to Unreal
-Find the IP address of the PC running the Motive application (via the terminal)
-Create the OptiTrack client in Unreal, put it at the center of the scene, and enter the IP of the tracking PC
Add Motion Data to Move GameObject
-Create an OptiTrack object set, and set both it and the object we want to control to Movable
-Make the OptiTrack object set the parent object (set the child object's location to 0 in X, Y, and Z)
-Select the tracking ID (starting from 1)
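Zeroing the child's location matters because, under parenting, the child's world transform is the tracked parent's transform plus the local offset. A minimal sketch of that relationship (plain Python, not Unreal's API):

```python
def world_position(parent_pos, local_offset=(0.0, 0.0, 0.0)):
    """Child world position = tracked parent position + local offset.
    A zero offset puts the object exactly on the tracked point."""
    return tuple(p + o for p, o in zip(parent_pos, local_offset))

tracked = (1.5, 0.0, 2.0)                         # streamed rigid-body position
print(world_position(tracked))                    # lands exactly on the tracker
print(world_position(tracked, (0.0, 0.5, 0.0)))   # offset child floats beside it
```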
-Click Play in Unreal Engine
-Stream or play back the motion data from the tracking PC