NOKOV XingYing3.4-EN

6. Unity Plugin


Importing Plugins

  1. Extract the plugin installation package 'XINGYING_Unity_Plugin_XXXX.unitypackage', create a Unity project, click Assets (17.6.1), select 'Import Package > Custom Package...', and import the extracted Unity plugin (17.6.2);

  2. The first time you import, red compilation errors may appear in the Console. To fix this, allow unsafe code in the project settings: open Edit > Project Settings > Player and enable "Allow Unsafe Code" (17.6.3);


Unity settings and usage

  1. Select "Assets > NOKOV > Scenes > NOKOVExample" in the imported installation package and double-click to open it (17.6.4);

  2. Click on "Client NOKOV" in the Hierarchy panel, modify the Host IP on the right to 10.1.1.198, and select "XINGYING" for Skeleton Naming Convention (17.6.5, 17.6.6). If the XINGYING software is playing glove human body data, please select "Xing Ying With Hand" for Skeleton Naming Convention.

  3. Click on "Sapphiart@walk" in the scene. Ensure that the three modules in the Inspector are checked, and that the name of the "Skeleton Asset Name" in "NOKOV Skeleton Animation (Script)" matches the name of the human body in XINGYING (17.6.7, 17.6.8). After making modifications, press Enter to save.

  4. Ensure that XINGYING software is in playback mode, then click the playback button in Unity to let the human body in XINGYING move. The model in Unity will then perform synchronous motion (17.6.9).

  5. Switch to the Scene view to observe the model from different angles (17.6.10).
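Step 3 above requires that "Skeleton Asset Name" exactly match the name of the human body in XINGYING; conceptually this is a name lookup on each incoming frame, and a mismatch means no data is found and the model stays still. A tiny sketch of that matching step (plain Python, illustrative only; names and data shapes are assumptions, not the plugin's actual code):

```python
# Illustrative sketch: why "Skeleton Asset Name" must match the name in
# XINGYING. Assumed logic for illustration, not the plugin's actual code.

def find_skeleton(frame_skeletons, asset_name):
    """Look up the skeleton whose name matches the configured asset name.
    Returns the skeleton's data, or None when the names do not match
    (in which case the Unity model receives no motion)."""
    return frame_skeletons.get(asset_name)

frame = {"Sapphiart": {"Hips": (0.0, 0.9, 0.0)}}  # hypothetical frame data
print(find_skeleton(frame, "Sapphiart"))   # matched: bone data is applied
print(find_skeleton(frame, "sapphiart"))   # name mismatch: model stays static
```

Note that the lookup is case-sensitive, which is one common reason a model fails to move even though playback is running.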


Display of Named and Unnamed Points

  1. In Unity, click on "Client-NOKOV" and check the "Draw Unlabeled Markers" checkbox (17.6.11). Then click the "Play" button at the top of the Unity software, and unnamed points will be displayed in the Unity scene.

  2. Check the "Draw Markers" checkbox (17.6.12), then click the "Play" button, and the named points of the human body will be displayed in the Unity scene.

  3. After expanding "Client-NOKOV," the list shows the IDs of both named and unnamed points. Selecting an ID from the list highlights that point's position in the Unity scene, and its XYZ coordinates are displayed on the right (17.6.13).

  4. Scale ratio and up axis: Click "Client-NOKOV," then on the right click "Length Unit" to expand the dropdown and select the scale ratio (17.6.14). You can choose among "Meter, Centimeter, Millimeter" for scaling objects in the Unity scene; the default unit is "Meter." Click the "Up Axis" option to expand the dropdown and choose the up axis for the Unity scene, either "Y" or "Z"; the default is "Y."
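The unit and up-axis options above amount to a simple per-sample coordinate transform. A minimal sketch of the idea (plain Python, not the plugin's actual code; the Z-up to Y-up axis swap shown is one common convention and is an assumption):

```python
# Illustrative sketch of unit scaling and up-axis remapping for a marker
# position. NOT the NOKOV plugin's internal code; the Z-up -> Y-up mapping
# below is one common convention, assumed for illustration.

UNIT_SCALE = {"Meter": 1.0, "Centimeter": 0.01, "Millimeter": 0.001}

def to_unity_position(x, y, z, length_unit="Meter", up_axis="Y"):
    """Scale a raw (x, y, z) sample to meters and remap the up axis."""
    s = UNIT_SCALE[length_unit]
    x, y, z = x * s, y * s, z * s
    if up_axis == "Z":
        # Source data is Z-up; swap so Unity's Y axis points up.
        x, y, z = x, z, y
    return (x, y, z)

# A marker at 1500 mm height in Z-up data lands at roughly y = 1.5 m in Unity:
print(to_unity_position(100.0, 200.0, 1500.0,
                        length_unit="Millimeter", up_axis="Z"))
```

Picking the wrong unit or up axis therefore shows up as a model that is 10x/1000x the wrong size, or one lying on its side.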


Access VRPN Handle in Unity

  1. First, follow the instructions in the guide to connecting devices and integrating VRPN data with UE to set up the connection and motion capture of the handle.

  2. Attach reflective markers to the object to be captured and create a rigid body in Live mode using XINGYING software.

  3. Open Unity and create a new project.

  4. Drag the plugin into the "Assets" folder (17.6.11), click "Import" to import it, then double-click "Demo" in the "Assets > UVRPN > Sensors" directory to open the Unity scene (17.6.12) and switch to the "Scene" view (17.6.13).

  5. Parameter settings:

    • Set the local address: In "VRPN-Manager," set the "Host: (IP/localhost)" address to "127.0.0.1" (17.6.14). If connecting to another host, change it to that host's server IP address. After setting the IP, press Enter and Ctrl+S to save.

    • Turn on logging: Enabling logging lets you see detailed information about the buttons pressed on the handle, and printing the error log helps identify and resolve issues. Select 'Window > General > Console' and check 'Debug Log' (17.6.15).

    • Set Flystick: Click on "Flystick" and enter the name of the rigid body created by VR Tracker software in the "Tracker" input box in the "VRPN_Tracker (Script)" module. Set "Channel" to 0 and press Enter to save (17.6.16).

    • In the "Tracker" input box of the "VRPN_Button (Script)" module, enter the name of the handle (joystick1 or joystick2) and check "Debug log" to enable printing of log information (17.6.17). If the first handle is connected, enter "joystick1" and press Enter. The "Channel" value here is the number of the corresponding handle button.

    • Enter the name of the handle (joystick1, joystick2) in the "Tracker" input box of the "VRPN_Analog (Script)" module, and set "Channel" to 0 (17.6.18).

  6. Real-time Drive Tracker

    • After completing the above configuration, set the "Channel" value in the "VRPN_Button (Script)" module to 1 and press Enter to save. Click Play to switch to the "Scene" view; after moving the rigid body, the object in Unity is driven synchronously. Now press button "1" on the handle, and the Unity "Console" prints the button information: "Button 1 Hold" indicates that button 1 of the handle is pressed, and "Button 1 Up" indicates that button 1 has been released (17.6.19).

    • If you need to use other buttons on the handle, set the "Channel" value to the corresponding number in the "VRPN_Button (Script)" module.

    • If the handle wheel is used, there is no need to set "Channel"; simply slide the wheel and the "Analog" coordinate value in the "VRPN_Analog (Script)" module changes in real time (17.6.20).

    • Inversion of displacement and rotation: When using a rigid body to drive objects in a Unity scene in Live mode, the displacement and rotation of the rigid body data can be inverted. In the "VRPN_Tracker (Script)" module, "Position Tracking" represents the displacement of the rigid body and "Rotation Tracking" its rotation; checking "X, Y, Z" inverts the coordinate values of the corresponding axes (17.6.21).
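The "Button 1 Hold" / "Button 1 Up" messages come from tracking button state transitions on the chosen channel, and the axis checkboxes simply negate the selected coordinates. A rough sketch of both ideas (plain Python, assumed logic for illustration; the plugin's actual implementation may differ):

```python
# Illustrative sketch of VRPN-style button logging and axis inversion.
# Assumed logic for illustration, not the plugin's actual code.

def button_log(channel, states):
    """Turn a sequence of pressed/released samples (True/False) for one
    button channel into "Hold"/"Up" lines, mimicking the Console output."""
    lines = []
    prev = False
    for pressed in states:
        if pressed:
            lines.append(f"Button {channel} Hold")
        elif prev:
            # Transition from pressed to released.
            lines.append(f"Button {channel} Up")
        prev = pressed
    return lines

def invert_axes(pos, invert_x=False, invert_y=False, invert_z=False):
    """Negate the checked axes of an (x, y, z) sample, as the
    "X, Y, Z" checkboxes in "VRPN_Tracker (Script)" do."""
    x, y, z = pos
    return (-x if invert_x else x,
            -y if invert_y else y,
            -z if invert_z else z)

print(button_log(1, [True, True, False]))
print(invert_axes((0.3, 1.2, -0.5), invert_x=True))
```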

(Figures 17.6.1 – 17.6.21: screenshots referenced above.)