
4. Agent

  1. The XINGYING software includes built-in intelligent agent models for "drones" and "unmanned vehicles." After creating a rigid body in real-time mode, or loading .cap data during post-processing, an intelligent agent can be bound to the specified rigid body. The intelligent agent view includes the "main dashboard," "speedometer," "pitch and roll angle dashboard," and "yaw angle dashboard" (Figure 9.4.1). Once a rigid body is bound to the intelligent agent dashboard, the timer in the chart starts and all gauges begin displaying detailed data.

  2. Here are two methods for connecting drones to the XINGYING system, along with the necessary precautions:


Local Connection

  1. First, configure the data forwarding port of the drone. After the port is set, power on the drone.

  2. Insert network cable A into the LAN1 port of the router (Figure 9.4.2) and connect the other end to the switch where the motion capture cameras are located. Insert network cable B into the LAN2 port of the router and connect the other end to the network port on the motion capture computer. If the computer does not have enough network ports, you can use a USB Ethernet adapter as an extension and insert network cable B into the adapter.

  3. Open XINGYING, navigate to the software settings, select "Agent Settings," and set the listening port (Figure 9.4.3). Ensure the listening port matches the data forwarding port set on the drone; a quick way to verify that data is arriving on this port is sketched after this list.

  4. Connect the cameras and start the live view, then create a rigid body. At this point, you can bind an intelligent agent model (Figure 9.4.4).

  5. Open the intelligent agent dashboard (Figure 9.4.5), bind the created rigid body, select the device number (if there are multiple drones, multiple numbers will be listed in the dropdown), and set the type to "drone"; a drone icon will then be displayed in the middle of the yaw angle dashboard.

  6. At this point, the gauges in the intelligent agent window will display data (Figure 9.4.6), and the top-right corner will show the drone's real-time battery level. In Edit Mode, the intelligent agent dashboard does not display device numbers or battery levels. You can resize the window by dragging the button at the bottom-right corner of the dashboard.
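A minimal way to confirm that the drone's forwarded data actually reaches the motion capture computer on the configured port is to listen on that port directly. The Python sketch below is an illustration only: the port number 14550 is an assumed example, and XINGYING must be closed while the script runs so that the port is not already in use.

    # Minimal sketch: check that the drone's forwarded UDP data reaches this machine.
    # Assumptions: the drone forwards to UDP port 14550 (replace with your port),
    # and XINGYING is closed so the port is free.
    import socket

    LISTEN_PORT = 14550  # must match the drone's data forwarding port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", LISTEN_PORT))  # listen on all local interfaces
    sock.settimeout(10.0)                # give up after 10 seconds of silence

    try:
        data, addr = sock.recvfrom(4096)
        print(f"Received {len(data)} bytes from {addr[0]}:{addr[1]} - forwarding works.")
    except socket.timeout:
        print("No packets received; check the cabling, the forwarding port, and the drone power.")
    finally:
        sock.close()

If packets arrive here, the port configured as the listening port in "Agent Settings" will receive the same data once XINGYING is reopened.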


Local Area Network (LAN)

  1. First, configure the data forwarding port of the drone, and then power it on.

  2. To connect the drone via LAN, ensure that the drone's IP address and the motion capture computer's IP address are on the same subnet. If not, configure the drone's IP address accordingly, and set the data forwarding port for the drone.

  3. On the motion capture computer, open a terminal and enter ssh pi@192.168.2.253 to access the onboard computer of the drone; "192.168.2.253" is the IP address of the drone.

  4. Open another terminal and enter roslaunch mavros px4.launch gcs_url:="udp://@192.168.2.124:14550". Here "192.168.2.124" is the IP address of the motion capture computer, and "14550" is the drone data listening port set in the XINGYING software, which should match the data forwarding port set on the drone (Figure 9.4.7). A quick check that the forwarded telemetry is arriving is sketched after this list.

  5. Open XINGYING, go to the software settings, select "Agent Settings," and set the listening port. Ensure the listening port matches the data forwarding port set on the drone, then turn on the ground control station switch.

  6. Connect the cameras and start the live view, create a rigid body, open an agent window, and bind the created rigid body. Select the device number and specify the type. The gauges in the agent window will begin displaying data, with the drone's real-time battery level shown in the top-right corner.
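Before binding the rigid body in XINGYING, it can help to confirm that mavros is really forwarding MAVLink telemetry to the motion capture computer. The sketch below is a minimal, assumed example: it requires the pymavlink package (pip install pymavlink), uses the example port 14550 from the command above, and XINGYING should be closed while it runs so the port is free.

    # Minimal sketch: wait for a MAVLink heartbeat on the port mavros forwards to.
    # Assumptions: pymavlink is installed, the gcs_url port is 14550, and
    # XINGYING is closed so the UDP port is not occupied.
    from pymavlink import mavutil

    conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # listen for forwarded MAVLink
    heartbeat = conn.wait_heartbeat(timeout=10)               # block for up to 10 seconds

    if heartbeat is not None:
        print(f"Heartbeat from system {conn.target_system}, component {conn.target_component}.")
    else:
        print("No heartbeat received; check the roslaunch command, IP addresses, and port.")

A heartbeat here means the UDP stream is reaching the capture PC; if XINGYING still shows no device number, recheck the listening port in "Agent Settings."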


Precautions

When switching between live and edit modes, all intelligent agent dashboards in the software interface will be cleared. To restart the agent view, reconfigure the options.

Use the intelligent agent function in an environment where the up axis of the calibrated coordinate system (the upward axis at the coordinate system origin) is the Z-axis.

If the device number is not displayed, please check the following:

When using a LAN, verify that the IP addresses of the drone and the motion capture computer are on the same subnet (a quick subnet check is sketched at the end of this section).

Check whether the ground control station switch in XINGYING is turned on.

Confirm that the listening port in XINGYING matches the data forwarding port of the drone.

Perform a self-check to ensure the drone is working properly.
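For the same-subnet check above, the following minimal sketch compares the two addresses programmatically. The addresses and the /24 prefix length are example assumptions and should be replaced with your actual network configuration.

    # Minimal sketch: check whether the drone and the motion capture computer
    # are on the same subnet. The addresses and /24 prefix are examples only.
    import ipaddress

    DRONE_IP = "192.168.2.253"       # replace with your drone's IP address
    CAPTURE_PC_IP = "192.168.2.124"  # replace with the motion capture computer's IP
    PREFIX = 24                      # replace with your subnet mask length

    drone_net = ipaddress.ip_interface(f"{DRONE_IP}/{PREFIX}").network
    pc_net = ipaddress.ip_interface(f"{CAPTURE_PC_IP}/{PREFIX}").network

    if drone_net == pc_net:
        print(f"Same subnet ({drone_net}); the LAN connection should work.")
    else:
        print(f"Different subnets ({drone_net} vs {pc_net}); reconfigure the drone's IP address.")

If the subnets differ, change the drone's IP address (or the computer's) so that both fall on the same subnet before retrying.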

Figures 9.4.1–9.4.7 (referenced in the steps above).