
1. Unreal Engine (UE) Plugin


Last updated 6 months ago

Software Setup and Human Model Creation

  1. Open the Data Casting Pane and enable "SDK" in the software.

  2. Ensure that calibration has been completed in XINGYING. Then create a 53-point human model in real-time mode, or load data containing a 53-point human model in post-processing mode, and play it back in the software.


Plugin Installation and UE Settings

  1. Download the NOKOVLiveLink plugin that matches your version of UE; installation and usage are the same across versions. For example, if you are using the UE_4.26 engine, unzip the "NOKOVLiveLink1.XXX" plugin build for UE_4.26 and copy the entire plugin folder into the UE_4.26 engine's plugin path, typically "Epic Games\UE_4.26\Engine\Plugins"; the exact path depends on where the engine is installed on your computer (17.1.1).

  2. After copying the plugin folder for version 4.26 to the corresponding location, open the UE_4.26 engine and create a project. Go to "Edit—Plugins", check "Animation—Live Link" and "Motion Capture—NOKOVLiveLink" (17.1.2, 17.1.3), and restart the engine to activate the plugin. The steps are the same for the UE_4.27, UE_5.0, and UE_5.1 engines.

  3. In UE_4.26, select "Window—Live Link" and click "Source—NOKOV Live Link." The Server IP must match the network card address set in the Data Casting panel of the XingYing software; the default is 10.1.1.198. Ensure the Up Axis is consistent with the setup in XINGYING, and click OK. Play XINGYING to receive motion capture data; a green indicator light will illuminate (17.1.4).

  4. Create an Animation Blueprint by locating a skeleton with a "Skeletal Mesh" in the menu bar (17.1.5). Right-click the Skeletal Mesh and select Create, then Animation Blueprint (17.1.6). Double-click to open the newly created Animation Blueprint, right-click inside it, search for "Live Link Pose", and double-click to add it (17.1.7).

  5. In the "Live Link Pose" node, select the name of the target MarkerSet under "Live Link Subject Name" to drive the model in UE4 with that MarkerSet. Drag to connect "Live Link Pose" to "Output Pose" (17.1.8). On the right side, select Retarget, then Retarget Asset, and choose the NOKOVLiveLinkRetarget Asset (17.1.9).

  6. Ensure that the human MarkerSet in XINGYING software is in a T-pose position. Click the Compile button in UE4. When the human MarkerSet in XINGYING starts to move, the model in UE4.26 will be driven and move synchronously.

  7. The operation steps for UE_4.27, UE_5.0, UE_5.1, UE_5.2, UE_5.3 engines are the same as for UE_4.26. The difference is that for the UE_4.27 engine, you choose Window -> LiveLink (17.1.10), and for UE_5.0 and later versions, you choose Window -> Virtual Production -> LiveLink (17.1.11).
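
At its core, step 1 of the installation is just copying the unzipped plugin folder into the engine's Plugins directory. A minimal sketch of that copy, assuming Python and hypothetical paths (substitute the actual locations of your downloaded plugin and installed engine):

```python
import shutil
from pathlib import Path

def install_plugin(plugin_dir: str, engine_plugins_dir: str) -> Path:
    """Copy an unzipped UE plugin folder into the engine's Plugins directory."""
    src = Path(plugin_dir)
    dest = Path(engine_plugins_dir) / src.name
    # copytree fails if the destination already exists, so remove any
    # previously installed copy first.
    if dest.exists():
        shutil.rmtree(dest)
    shutil.copytree(src, dest)
    return dest

# Example (hypothetical paths -- substitute your own):
# install_plugin(r"C:\Downloads\NOKOVLiveLink",
#                r"C:\Program Files\Epic Games\UE_4.26\Engine\Plugins")
```

After copying, restart the engine and enable the plugin under "Edit—Plugins" as described above.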


Introducing Human Data with Skeletal Redirection into UE

  1. Data from XingYing can be fed directly into UE through the plugin to drive the human model. However, when the model's proportions differ significantly from the performer's, driving the model directly may cause issues such as foot sliding. In such cases you can bind and redirect through MotionBuilder, then drive the human model in UE via MotionBuilder's UE plugin (MotionBuilder LiveLink), which can be downloaded from the UE official website or obtained from NOKOV engineers. Alternatively, you can import the human model into XingYing and enable the redirection function; once the data has been transformed, the output can bypass MotionBuilder and drive the model in UE directly. For the steps to drive a model using XingYing's redirection function, refer to section "XV. Data Retargeting".

  2. First, create a human model in Live mode in XINGYING, or load human data in Edit mode. Use the model asset import feature to import an FBX or HTR model file and perform skeletal redirection. For detailed steps, refer to section "XV. Data Retargeting".

  3. After redirecting the human skeleton in XINGYING, enable "Redirected Data" in the settings. Open UE and import the model file that was redirected in XINGYING. Right-click in an empty space, select Blueprint Class (17.1.12), type "NOKOV" in the pop-up window's text box, select "NOKOVLiveLinkRetargetAsset", and click Select (17.1.13). Name the Blueprint Class; if you do not rename it, it defaults to "NewBlueprint". A Blueprint Class with a cube icon will appear in the content side menu.

  4. Double-click on the Blueprint Class you just created, and you will see two options for "Self Adaptation" and "Use Translation" at the bottom of the popup window. The "Use Translation" option is checked by default, and the "Self Adaptation" option is unchecked by default (17.1.14). If you are not using skeletal redirection to drive the UE model, check the "Self Adaptation" option and uncheck the "Use Translation" option, click Save, and then Compile. If you are using skeletal redirection, ensure the "Use Translation" option is checked and the "Self Adaptation" option is unchecked; click Save, then Compile. Driving the UE model with redirected human data and checking the "Self Adaptation" option may cause abnormalities in the model's skeleton in UE.

  5. To drive the UE model using the skeletal redirection feature, and after unchecking the "Self Adaptation" option in the Blueprint Class, you can refer to steps 5-8 above to drive the model. The only difference is that you need to select the name of the Blueprint Class with the cube icon (that has the "Use Translation" option checked) from the "Retarget—Retarget Asset" drop-down menu on the right side of the window (17.1.15). After selecting it, click Compile. When XINGYING software plays and the MarkerSet starts to move, the model in UE will be synchronously driven.

  6. If you are not using skeletal redirection to drive the UE model, in the Blueprint Class check the "Self Adaptation" option, uncheck the "Use Translation" option, and select the name of the Blueprint Class with the cube icon (that has the "Self Adaptation" option checked) from the "Retarget—Retarget Asset" drop-down menu. After selecting, click Compile. When the XINGYING software plays and the MarkerSet starts moving, the model in UE will be synchronously driven.

  7. When driving the UE model with redirected human data, ensure that the "Use Redirected Data" feature in XINGYING is enabled. If it is not, enable it, then disconnect and re-establish the LiveLink connection between UE and XINGYING. Failure to do so may cause inconsistencies between the human data and the SDK's information, leading to abnormalities in the UE model's skeleton.
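
The foot-sliding problem mentioned above comes from a proportion mismatch: the performer's stride is too long or too short for the model's legs. This is not NOKOV's retargeting algorithm, but a minimal illustration of the core idea — scale the root translation by the ratio of the model's limb length to the performer's (real retargeting also adjusts per-joint rotations):

```python
def retarget_root_translation(root_pos, source_leg_len, target_leg_len):
    """Scale a performer's root translation (x, y, z in meters) by the
    target model's leg length over the performer's, so stride length
    matches the model and the feet do not slide."""
    s = target_leg_len / source_leg_len
    return [c * s for c in root_pos]

# A performer with 0.9 m legs driving a model with 0.45 m legs:
# forward travel and hip height are halved so the shorter legs keep up.
retarget_root_translation([1.0, 0.0, 0.95], 0.9, 0.45)  # → [0.5, 0.0, 0.475]
```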


Blueprint Class

  1. Double-click the created Blueprint Class to open it (17.1.16).

  2. The NOKOV Skeleton Mapping table displays XINGYING's human skeleton names on the left and the UE model skeleton names on the right. To edit the list, select the "Enable skeletonMapping" checkbox in the NOKOV Skeleton Mapping option. Otherwise, the list will be uneditable. Each joint skeleton name has a checkbox on its left; if checked, the joint skeleton will participate in data driving. If unchecked, it will not.

  3. Click the Select button to the right of an individual joint (row), then enter the skeleton name in the search box to perform a fuzzy match for the target joint. For example, if you select the "Spine1" skeleton and click the Select button, you can enter "Spine" in the search box to perform a fuzzy match for the target joint (17.1.18).

  4. Click "Select a skeleton for mapping" at the bottom of the window to display all the skeleton resources (17.1.19). Select a skeleton resource to apply it and check if its skeleton names correspond correctly with those of XINGYING's human model. If there's a mismatch, use the Select button to match the target joints via fuzzy search.

  5. If you are using motion capture data to drive the UE model and the model's skeleton displays abnormalities or incorrect positions, this may be due to some skeleton names being inconsistent with those in the XINGYING software, leading to abnormal skeletons in UE. In this case, check the mapping of the skeleton names. Double-click on the created Blueprint Class, and check the skeleton joints on both sides in the NOKOV skeleton mapping list to ensure they correspond correctly. If there's a mismatch, make the necessary changes, save, and click Compile. Then, select the name of the modified Blueprint Class in the Retarget—Retarget Asset drop-down menu and click Compile to successfully drive the model's movement.

  6. To see the skeleton names and hierarchy of the model file imported into UE, double-click on the model, click on the skeleton icon in the model window, and you will see a tree diagram on the right side showing the model's skeleton names (17.1.20).

  7. When selecting a part of the model's skeleton from the skeleton name tree diagram, the model will display the specific location of that skeleton on its body. This method is used to verify that the UE skeleton names correspond correctly with the XINGYING human model skeleton names. For example, select the UE model's forearm skeleton and check in the NOKOV skeleton Mapping list if the UE forearm skeleton name corresponds correctly with the XINGYING human body forearm skeleton. If you find that the UE forearm skeleton is incorrectly mapped to the XINGYING body upper arm skeleton, copy the name of the forearm skeleton from the UE model and map it to the forearm skeleton of the XINGYING body in skeleton Mapping. Do this for each skeleton to ensure all names match correctly. After completing the checks, click "Save." You can view the XINGYING human model skeleton positions and their names by clicking the "Joints" list in the XINGYING assets panel.

  8. If the skeletons are correctly mapped between XINGYING and the UE model but the model still shows abnormalities in UE, please contact our technical engineers for assistance.
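
The fuzzy search behind the Select button (steps 3–5) can be approximated with standard string matching. A sketch using Python's stdlib — the UE-side bone names here are hypothetical examples, not NOKOV's actual mapping table:

```python
import difflib

# Hypothetical UE-side bone names; a real model's skeleton tree (17.1.20)
# would supply these.
ue_bones = ["pelvis", "spine_01", "spine_02", "upperarm_l", "lowerarm_l"]

def fuzzy_match(xingying_bone: str, candidates, cutoff: float = 0.4):
    """Return the candidate bone names closest to a XINGYING bone name,
    best match first -- analogous to typing a partial name into the
    Select button's search box."""
    return difflib.get_close_matches(
        xingying_bone.lower(), candidates, n=3, cutoff=cutoff)

fuzzy_match("Spine1", ue_bones)  # best matches first, e.g. "spine_01"
```

As in the mapping list itself, a near miss (e.g. a forearm bone matched to an upper-arm bone) still has to be caught by eye; fuzzy matching only narrows the candidates.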


LiveLink Rigid Body Application

  1. To use the UE rigid body application, first place an object with reflective markers in the motion capture area. In Live mode, use the one-click rigid body creation function to create a MarkerSet. After successfully creating the rigid body, open the Data Casting panel and enable "SDK."

  2. Open the UE project file. If creating the UE project for the first time, please contact our technical engineers to obtain the corresponding plugin version and refer to "1) UE Plugin" above to correctly configure our plugin.

  3. Click Window — Virtual Production — LiveLink in UE. In the LiveLink window, click to add a source and select "NOKOV LiveLink." Ensure the IP address matches that of the XINGYING software, which defaults to 10.1.1.198, and that the axis directions are consistent, which defaults to Y-up (17.1.21).

  4. In the dropdown list for NOKOV Server IP, all the network card addresses of the local machine will be displayed. You can also click "Remote Server IP" and enter the IP address in the "NOKOV Remote IP" input field.

  5. After completing the settings, click OK and play the XINGYING software. A green light indicates that the connection is active, and the names of all connected MarkerSets are displayed. A subject labeled "Transform" is a rigid body MarkerSet, while one labeled "Animation" is a human MarkerSet (17.1.22).

  6. Next, quickly create a static mesh in UE and add it to your project by selecting a geometric shape (17.1.23).

  7. After creation, the instance you just created appears in the "Details" panel on the right, and the geometry sphere has been added to the view (17.1.24). Click the Add button, enter "LiveLink", and select the "Live Link Controller" component (17.1.25).

  8. After adding the LiveLinkComponentController component, select it. In the dropdown menu to the right of "Subject Representation", choose the name of the rigid body connected earlier in the LiveLink tab. Click "Tracker0" (17.1.26) to use the rigid body named "Tracker0" to drive the geometry sphere added in UE.

  9. Select the root component of the default scene and change its mobility to movable (17.1.27).

  10. Next, play the XINGYING software. As the "Tracker0" rigid body moves, the sphere in UE will move in synchronization with it.
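
Step 4's dropdown lists the machine's network card addresses. To check outside UE which local IPv4 addresses exist — one of them should match the address configured in XINGYING's Data Casting panel (e.g. 10.1.1.198) — a quick stdlib sketch:

```python
import socket

def local_ip_candidates():
    """List IPv4 addresses that resolve for this machine's hostname.
    One of these should match the network card address configured in
    XINGYING's Data Casting panel (e.g. 10.1.1.198)."""
    hostname = socket.gethostname()
    # gethostbyname_ex returns (hostname, alias_list, ip_list)
    _, _, ips = socket.gethostbyname_ex(hostname)
    return ips

print(local_ip_candidates())  # e.g. ['10.1.1.198', '192.168.0.12']
```

If the expected motion-capture-network address is missing from this list, check the cabling and the adapter's static IP configuration before troubleshooting inside UE.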


Solutions for UE Engine Refresh Issues

  • When using the XINGYING plugin to drive a model in UE, if XINGYING loses the human body and, after re-recognition, the model does not reappear in the UE scene, the cause is a refresh problem in the UE engine. To resolve it, after linking the Live Link Pose to the Output Pose in the Animation Blueprint, also add the required nodes in the Animation Blueprint's event graph and connect them (17.1.28). This solves the problem of the UE engine not refreshing.

Figures 17.1.1–17.1.28: screenshots referenced in the steps above.