
3. Data Casting


SDK Broadcasting Data (6.3.1)

  1. Click 'Data Casting' at the top of the software interface. The default network card address in XINGYING is '10.1.1.198'; the IP address can also be changed from the drop-down box.

  2. XINGYING supports SDK secondary development, allowing data to be broadcast for use on the same machine or on other computers within the same IP segment. The IP address can be changed live through the 'Network Card Address' drop-down box. The configured network card address is the address from which the server sends data; to receive motion capture data, the client computer must be in the same network segment as the server.

  3. The software supports dynamic IP acquisition, eliminating the need to restart. It can work in tandem with C++/C# and other plugins to acquire motion capture data in real time. Please consult our technical engineers to obtain the plugin version.

  4. SDK Streaming: 'SDK Enable' is not selected by default and can only be toggled while the cameras are disconnected or paused. Once selected, the software saves this configuration, so SDK is enabled by default the next time the software starts. With SDK enabled, motion capture data is broadcast externally through the configured network card IP address using the SDK protocol. The software supports 'Unicast' and 'Multicast' modes; the default is 'Multicast'.

  5. After the SDK option is selected, the lower-left corner of the 3D view in Live mode displays the delay time and its unit in real time (6.3.2). If the SDK function is turned off, the delay time is not displayed; it is also not displayed in Edit mode.

  6. Skeleton Coordinates: the skeleton coordinate type defaults to 'Global' (6.3.1) and can be switched between 'Global' and 'Local' from the drop-down menu. The setting is saved in the configuration, so the next time XINGYING launches, the previously selected coordinate type is used:

    • Global: the SDK broadcasts skeleton data in global coordinates, so the client receives global skeleton coordinate data.

    • Local: the SDK broadcasts skeleton data in local coordinates, so the client receives local skeleton coordinate data.

    • For rigid-body data, the broadcast is identical whether 'Global' or 'Local' is selected. For human (body) data, the broadcast skeleton data differs between the two coordinate types.

    • To change the skeleton coordinate type, 'SDK Enable' must first be turned off; otherwise, the skeleton coordinates option is greyed out and cannot be changed.
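The distinction item 6 draws between 'Global' and 'Local' skeleton coordinates comes down to one composition rule: a bone's global pose is its parent's global pose composed with the bone's local pose. The sketch below illustrates that rule in plain Python with quaternions stored as (w, x, y, z) tuples; it is a hypothetical illustration only, not NOKOV's actual SDK data layout or API.

```python
# Hypothetical sketch of local -> global skeleton coordinate conversion.
# Assumes each bone stores position/rotation relative to its parent;
# this is NOT the NOKOV SDK's actual data format.

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q via q * (0, v) * conj(q)."""
    w, x, y, z = q
    qv = (0.0, v[0], v[1], v[2])
    q_conj = (w, -x, -y, -z)
    _, rx, ry, rz = quat_mul(quat_mul(q, qv), q_conj)
    return (rx, ry, rz)

def local_to_global(parent_pos, parent_rot, local_pos, local_rot):
    """Compose a bone's local transform with its parent's global transform."""
    rotated = quat_rotate(parent_rot, local_pos)
    global_pos = tuple(p + r for p, r in zip(parent_pos, rotated))
    global_rot = quat_mul(parent_rot, local_rot)
    return global_pos, global_rot
```

With an identity parent transform, the local and global poses coincide (which is why rigid-body broadcasts look the same in either mode); with the parent rotated 90° about Z, a local offset along X maps onto the global Y axis.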


VRPN Streaming (6.3.3)

  1. Enable VRPN in 'Data Casting' while the software is paused or the cameras are disconnected. After VRPN is enabled, motion capture data will be broadcast externally through the set network card IP address using the VRPN protocol.

  2. VRPN data comes in three types: 'Rigid', 'Marker', and 'Marker (Unnamed)'. Both Live mode and Edit mode can transmit data over VRPN. Before using VRPN, you can verify that VRPN data is reachable with our 'NOKOVVrpnClient.exe' test tool, or with open-source VRPN code or tools. Below is a brief introduction to obtaining the different data types with the NOKOVVrpnClient test tool. (Please contact a technical engineer if you need this test tool.)

  3. Checking the 'Rigid' type sends rigid body data. The naming convention in the test tool is 'Rigid Body Name@10.1.1.198'. Check 'Rigid' and 'VRPN Enable', then click play to put the cameras into play mode. Open a terminal (cmd) in the directory containing NOKOVVrpnClient.exe, enter the command '.\NOKOVVrpnClient.exe Tracker0@10.1.1.198', and press Enter (6.3.4). You will receive the rigid body data from motion capture, where 'Tracker0' corresponds to the rigid body's name in the motion capture software.

  4. The naming convention for the human body is 'MarkerSet Name_Skeleton Name'. For example, to get information on the human head skeleton, enter the command '.\NOKOVVrpnClient.exe Body0_SHead@10.1.1.198' and press Enter; you will then receive the head skeleton data of the human body named 'Body0'. 'SHead' is the name of the head skeleton. To get other skeletons, simply replace 'SHead' with the corresponding skeleton name.

  5. The 'Marker' type transmits data for named marker points. The naming convention in the test tool is 'Rigid Body Name_Marker Name@10.1.1.198'. After selecting 'Marker' and 'VRPN Enable', click play to put the cameras into play mode, then open a terminal in the directory containing NOKOVVrpnClient.exe. For example, to get data for the point 'Marker1' belonging to the rigid body 'Tracker0', run '.\NOKOVVrpnClient.exe Tracker0_Marker1@10.1.1.198' and press Enter (6.3.5). 'Tracker0' corresponds to the rigid body's name in the software, and 'Marker1' is the name of a point belonging to that rigid body. The specific point names of a rigid body can be viewed in the Assets - Component - Markers list.

  6. The 'Marker (Unnamed)' type transmits data for unnamed marker points. The naming convention is 'U_Tracker + unnamed marker index (starting from 0)@10.1.1.198'. Select 'Marker (Unnamed)' and 'VRPN Enable', click play to put the cameras into play mode, then open a terminal in the directory containing NOKOVVrpnClient.exe, enter '.\NOKOVVrpnClient.exe U_Tracker0@10.1.1.198', and press Enter (6.3.6) to receive the unnamed marker data. Here, 'U_Tracker0' is the first unnamed point; to get the data of the second unnamed point, change 'U_Tracker0' to 'U_Tracker1', and so on.

  7. The 'Unit' option box allows you to select different units of measurement. You can choose from 'millimeter', 'centimeter', and 'meter'. After changing the unit, the data in VRPN will adjust accordingly.

  8. In 'Invert', checking the x, y, or z coordinates will reverse the sign of the variables selected in 'Type'. 'qx, qy, qz' represent the rigid body's rotational data. After selecting 'Rigid' and running '.\NOKOVVrpnClient.exe Tracker0@10.1.1.198', 'quat' displays the rotational data of the rigid body 'Tracker0'. After selecting 'qx, qy, qz', the coordinate values of 'quat' are inverted. The 'quat' value only shows the rigid body's rotation; no rotation information is shown for the 'Marker' and 'Marker (Unnamed)' types.

  9. In 'Offset', you can set the 'x, y, z' coordinate offsets for the data variables. After adjusting the offset values, VRPN will add these modified values to the original coordinates.

  10. When 'Velocity' and 'Acceleration' are checked, they will output corresponding data in VRPN. 'Frames' is used for speed calculation; adjusting the frame factor can align the actual speed value with the output speed value.
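The 'Unit', 'Invert', and 'Offset' options in items 7–9 amount to a simple per-axis transform applied to each streamed coordinate. The Python sketch below illustrates how they combine; the helper name and the processing order (scale, then invert, then offset) are assumptions for illustration, not NOKOV's actual implementation.

```python
# Hypothetical sketch of the VRPN 'Unit' / 'Invert' / 'Offset' options.
# Assumes raw positions arrive in millimeters; the processing order
# shown here is an assumption, not NOKOV's implementation.

UNIT_SCALE = {"millimeter": 1.0, "centimeter": 0.1, "meter": 0.001}

def apply_vrpn_options(pos_mm, unit="millimeter",
                       invert=(False, False, False),
                       offset=(0.0, 0.0, 0.0)):
    """Convert a raw (x, y, z) position in millimeters to the streamed value."""
    scale = UNIT_SCALE[unit]
    out = []
    for value, flip, off in zip(pos_mm, invert, offset):
        value *= scale                 # 'Unit': rescale from millimeters
        if flip:
            value = -value             # 'Invert': reverse the sign
        out.append(value + off)        # 'Offset': add the per-axis offset
    return tuple(out)
```

For example, a raw point at (1000, -500, 250) mm streamed in meters, with x inverted and a +1.0 offset on y, would come out as (-1.0, 0.5, 0.25).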

Network Streaming

  1. XINGYING is compatible with the Xsens MVN protocol and can be used with any software that supports it. Through Haption software, XINGYING data can be sent to Dassault Systèmes' DELMIA and CATIA to drive human models and props; data can also be sent to BOB software to drive human models.

Figures 6.3.1–6.3.6: XINGYING interface screenshots referenced above (images not included).