5. Maya Plugin

Plugin Installation

  1. Download the latest plugin version: NOKOV MayaPlugin XXX (2018-2022).exe.

  2. Extract the installation package, double-click the extracted installer, and install it into the Maya software directory (17.5.1). Then click Next > Install and wait for the installation to complete. (The same process applies to the glove plugin for Maya.)


Settings and Usage of Maya

  1. Click "Windows" > "Settings/Preferences" > "Plug-in Manager", search for "Plugin", and check the box for "PluginMaya.mll" (17.5.2). Hover over the icon to see detailed information.

  2. Close the Plug-in Manager, type "NOKOVPluginWindow" into the MEL command line at the bottom of the Maya window, and press Enter to load the plugin window (17.5.3).

  3. Select the network card address that matches the one used by the XINGYING software, then click "Offline" to switch to the "Online" state. Check "Refresh UI"; after the motion capture software starts, Maya will continuously receive data from it (17.5.4).

  4. When the human body moves in the XINGYING software, the model in Maya is driven to perform the same motion in sync (17.5.5).

Note:

Since Maya's skeleton nodes do not allow numeric names, the human MarkerSet name in the XINGYING software must not start with a digit or consist only of digits. Otherwise, the skeleton will not be displayed in Maya.
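The naming rule above can be checked programmatically before broadcasting. A minimal sketch (the helper function is ours for illustration, not part of the NOKOV plugin):

```python
def is_valid_markerset_name(name: str) -> bool:
    """Check a MarkerSet name against the Maya naming rule described above.

    A name is rejected if it is empty, starts with a digit, or is purely
    numeric (a purely numeric name necessarily starts with a digit, so the
    first-character check covers both cases in the rule).
    """
    return bool(name) and not name[0].isdigit()

print(is_valid_markerset_name("Actor01"))  # valid name
print(is_valid_markerset_name("01Actor"))  # starts with a digit -> invalid
print(is_valid_markerset_name("2024"))     # purely numeric -> invalid
```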


Characterization

  1. Taking the post-processing mode of the motion capture software as an example: after the Maya plugin receives the human body data broadcast by XINGYING, you can characterize the body with one click. In the plugin window, click the "Characterize" button (17.5.6); in Maya you will see the characterized name, which follows the naming convention "Characterize_XINGYINGHumanBodyName". After characterization, the human skeleton in Maya is locked in a T-pose. Click the "Lock" button on the right side to unlock it; once unlocked, the skeleton in Maya moves in sync with the XINGYING human body.

  2. In Maya, you can also manually create a characterization for the motion capture human body data. After the model assumes a T-pose stance, pause playback in the XINGYING software, click the "Switch Character Control" option in the upper right corner, then click "Create Character Definition" (17.5.7).

  3. Perform the "Assign Selected Skeleton" operation on each bone of the character. Once all bones are bound, the characterization is complete (17.5.8).

  4. Import the model you want to drive into Maya and repeat the steps above to create a characterization for it. After creation, select Character 2 on the right with Character 1 as the source to bind the skeleton to the model. During playback, the model is driven by the character (17.5.9).


Display of Unnamed Points

  • In the NOKOVPluginWindow, check the "UnnamedMarkers" checkbox (17.5.10), and the unnamed points will be displayed in the Maya scene.


Server IP

  • After opening the NOKOVPluginWindow, click "ServerIP" to expand the IP selection dropdown; all IP addresses of the current host are listed there (17.5.11). Select the IP address that is in the same subnet as the XINGYING broadcast address; the default selection is "10.1.1.198". You can also choose "input the remote ip" from the dropdown, manually enter the IP address in the "Remote" input field below, and check the "Online" checkbox to connect to the motion capture data.
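Whether a local address is "in the same subnet" as the XINGYING host can be verified with a short script. A sketch using Python's standard `ipaddress` module; 10.1.1.198 is the plugin's stated default, while the peer addresses and the /24 mask are illustrative assumptions about a typical setup:

```python
import ipaddress

def same_subnet(ip_a: str, ip_b: str, prefix: int = 24) -> bool:
    """Return True if both addresses fall within the same network
    for the given prefix length (e.g. /24 = mask 255.255.255.0)."""
    network = ipaddress.ip_network(f"{ip_a}/{prefix}", strict=False)
    return ipaddress.ip_address(ip_b) in network

print(same_subnet("10.1.1.198", "10.1.1.50"))    # same /24 subnet
print(same_subnet("10.1.1.198", "192.168.1.5"))  # different subnet
```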
