User-Generated Emote specifications

Learn more about the prerequisites to leverage the User-Generated Emote feature.

Video input specifications

  • Video maximum duration: 10 seconds

A video trimming step is included to cut clips down to the 10-second maximum.

  • No video resolution limitations - 1080p is preferred

Please note that there is no added value in uploading files with a resolution higher than 1080p, as the pipeline reprocesses everything down to 1080p. For example, a 4K video can add up to 2 minutes of processing time just to reduce the file size. A quick local check before uploading (see the sketch after this list) can help avoid this overhead.

  • Compatible with any video format (MP4, AVI, MOV, MKV,...).

  • Real-life footage only. Footage from video games or animated films is not supported at the moment. Some non-real-life footage may still be detected by our AI and produce good outputs, but the results are very uncertain. Kinetix strongly recommends not uploading cartoon or other unrealistic footage.
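
The constraints above can be checked locally before uploading. Below is a minimal pre-upload sketch in Python, assuming ffprobe (shipped with FFmpeg) is available on the machine. It is purely illustrative and not part of the Kinetix SDK or API, and the file name my_clip.mp4 is a placeholder.

```python
# A minimal pre-upload check, assuming ffprobe (part of FFmpeg) is installed.
# "my_clip.mp4" is a hypothetical local file name.
import json
import subprocess

MAX_DURATION_S = 10      # Kinetix maximum emote video duration
TARGET_HEIGHT = 1080     # anything above 1080p is downscaled server-side

def probe_video(path: str) -> dict:
    """Return duration, width and height of the first video stream."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=width,height",
            "-show_entries", "format=duration",
            "-of", "json", path,
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)
    stream = data["streams"][0]
    return {
        "duration": float(data["format"]["duration"]),
        "width": int(stream["width"]),
        "height": int(stream["height"]),
    }

info = probe_video("my_clip.mp4")
if info["duration"] > MAX_DURATION_S:
    print(f"Clip is {info['duration']:.1f}s long; it will be trimmed to {MAX_DURATION_S}s.")
if min(info["width"], info["height"]) > TARGET_HEIGHT:
    print("Resolution above 1080p adds processing time without improving the emote.")
```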

AI body recognition capabilities

  • Full body movement: supported.

  • Finger / hand movements: supported. At the moment, the Kinetix AI detects whether fists are clenched, hands and fingers are open, or fingers are spread apart (celebrating a victory ✌️, for example). More complex hand movements are still hard to detect, but Kinetix keeps improving its models at scale.

  • Facial expressions: not supported for the moment - we are working on it!

  • Half body: not supported. The outputs for videos capturing only half-body movements are unpredictable. We strongly recommend using videos where the entire body is visible to ensure reliable and accurate results.

  • Multiple actors: not supported. Optimal results are achieved with videos that include 1 actor. We strongly recommend using videos where only one body is visible to ensure reliable and accurate results.

If your video includes multiple performers, we recommend cropping it so that the actor whose movement you want to extract appears alone or in the foreground of the video (see the sketch below). Learn more: Video recording best practices. If multiple persons are included in the video, the machine learning process exports the longest animation detected.
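
For reference, the sketch below shows one way to crop a clip with FFmpeg before uploading it. It is illustrative only and not part of the Kinetix tooling; the crop rectangle and file names are hypothetical and depend entirely on where the actor stands in your footage.

```python
# Illustrative only: crop a clip so that a single performer fills the frame, using FFmpeg.
# The crop rectangle (width, height, x, y) and file names are hypothetical.
import subprocess

crop_filter = "crop=w=720:h=1080:x=600:y=0"  # keep a 720x1080 region starting at (600, 0)

subprocess.run(
    [
        "ffmpeg", "-i", "full_scene.mp4",   # source clip with several performers
        "-filter:v", crop_filter,           # keep only the region around the chosen actor
        "-c:a", "copy",                     # leave the audio track untouched
        "cropped_actor.mp4",
    ],
    check=True,
)
```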

AI processing capabilities

Estimated waiting time: the average upload and processing time for a 5-10 second video is about 5 minutes.

Parallel processing: the Kinetix AI can process up to 200 videos simultaneously.
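
To illustrate these timings, here is a minimal polling sketch that waits for an emote to finish processing. The endpoint URL, header name and response fields are hypothetical placeholders, not the actual Kinetix API contract (see the API routes and API Webhooks pages for that); in production, webhooks may be preferable to polling.

```python
# Illustrative polling loop, assuming the third-party "requests" package.
# The URL, header name and JSON fields are hypothetical placeholders;
# refer to the API routes / API Webhooks pages for the actual Kinetix contract.
import time
import requests

POLL_INTERVAL_S = 30        # check every 30 seconds
TIMEOUT_S = 15 * 60         # allow well beyond the ~5 minute average

def wait_for_emote(status_url: str, api_key: str) -> dict:
    """Poll a (hypothetical) processing-status endpoint until the emote is ready."""
    deadline = time.time() + TIMEOUT_S
    while time.time() < deadline:
        response = requests.get(status_url, headers={"x-api-key": api_key})  # placeholder header
        response.raise_for_status()
        payload = response.json()
        if payload.get("status") == "completed":  # placeholder field name
            return payload
        time.sleep(POLL_INTERVAL_S)
    raise TimeoutError("Emote processing did not finish within the expected window.")
```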

Emote output specifications

  • The emote is directly accessible in-game and attributed to the user who created it.

  • Everything mentioned in Kinetix Emote Standards will be applied by our emotization script.
