Mocap Fusion [ VR ]
Motion Capture Fusion [VR] is an immersive room-scale mocap sandbox for artists and animators who wish to create and export motion capture animations, or produce live content, using conventional VR hardware. With as little as a single VR HMD and two controllers, users can record mocap on their own avatars. Advanced users can capture more detailed motion, including full body tracking, and combine additional sensors (e.g. the Apple iPhone TrueDepth sensor and Oculus Quest 2 optical finger tracking) to drive avatars from simultaneous inputs. This fusion of multiple sensors can combine many layers of motion capture in a single take, including full body tracking, face capture, lip sync, gaze tracking and optical finger tracking.
Highlights
Record motion capture using custom avatars, scenes and props.
Add custom shaders, textures, emotes, expressions, DynamicBone physics and more.
Vive Pro Eye, Vive lip tracking and iPhone ARKit face capture support.
HTC Vive, Valve Index, Oculus Rift, Oculus Quest 2, some WMR headsets.
Can be used without VR, with trackers only, or with only iPhone head tracking.
Add multiple avatars into a scene, build storyboards, react to other prerecorded avatars.
Camera motion capture and zoom, player acts as the cinematographer in VR.
VTOL and Fixed Wing flight simulation vehicle platforms for aerial photography shots.
Supported Output Formats
Exports to SFM (.dmx).
Exports to Blender (.blend).
Exports to Unity (.anim).
Live-Link (live production) and Live-Recording plugins
Live Link plugin available for Unreal Engine (live avatar sync, live recording).
Live Link plugin available for Blender (live avatar sync, live recording).
Live Link plugin available for Unity (live avatar sync).
Import custom avatars and use the same avatar throughout the production workflow. This eliminates the need for retargeting and ensures the mocap data always fits 1:1 without introducing offsets in the final results.
A unique feature of Mocap Fusion is its ability to export motion capture data and reconstruct the scene in Blender, making it available for final rendering in minutes.
Compatible Headsets (VR HMDs)
Valve Index
HTC Vive (and Vive Pro Eye).
Oculus Quest (1 and 2).
Optional Tracking Hardware
SteamVR Vive trackers.
iPhone TrueDepth sensor (face capture and eye tracking).
Oculus Quest 2 (full optical finger tracking).
Capabilities
Export mocap and create scenes in Blender™ instantly.
HTC™ Vive Trackers (up to 11 optional points) for full body tracking.
Ability to record, play back, pause, slow-mo and scrub mocap in VR.
Customizable IK profiles and avatar parameters.
SteamVR Knuckles support for individual finger articulation.
Quest 2 optical finger tracking app for individual finger articulation and finger separation.
Vive Pro Eye blink and gaze tracking support.
Sidekick iOS face capture app (TrueDepth markerless AR facial tracking).
User-customizable worlds, avatars and props may be built for mocap using the APS_SDK.
Compatible with existing Unity3D™ avatars and environments.
Supports custom shaders on mocap avatars.
DynamicBone support for adding hair, clothing and body physics simulation to avatars.
Breathing simulation for added chest animation.
Add/record/export VR cameras for realistic camera mocap (e.g. VR cameraman effect).
Optimized export of mocap (.bvh) data to Daz 3D.
Placement of "streaming" cameras for livestreaming avatars to OBS or as desktop overlays.
Microphone audio recording with lip-sync visemes and recordable jaw bone rotation.
Storyboard mode, save mocap experiences as pages for replaying or editing later.
Animatic video player, display stories and scripts, choreograph movement.
Dual-handed weapon IK solvers for natural handling of carbines.
Recordable VTOL platform for animating helicopter flight simulation (e.g. news choppers).
VR Camcorders and VR selfie cams may be rigidly linked to trackers.
VR props and firearms may be rigidly linked to trackers.
Ghost curves for visualizing the future locations of multiple avatars in a scene.
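Since exported mocap uses the standard BioVision Hierarchy (.bvh) format, the skeleton data can be inspected with generic tooling. A minimal sketch in Python, reading the joint hierarchy from a BVH file (this is a generic BVH example, not an official Mocap Fusion tool, and the sample data below is illustrative rather than actual app output):

```python
# Minimal sketch: list the joint names declared in a BVH file's
# HIERARCHY section. Generic BVH parsing; the sample skeleton below
# is a hypothetical two-joint rig for illustration only.

SAMPLE_BVH = """HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 12.0 0.0
        }
    }
}
MOTION
Frames: 1
Frame Time: 0.0333333
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
"""

def bvh_joint_names(text: str) -> list[str]:
    """Return joint names (ROOT and JOINT lines) in declaration order."""
    names = []
    for line in text.splitlines():
        parts = line.strip().split()
        if len(parts) == 2 and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names

print(bvh_joint_names(SAMPLE_BVH))  # ['Hips', 'Spine']
```

The same approach works on any exported .bvh file read from disk, which can help verify that all expected tracker-driven joints made it into the export before importing into Daz 3D or Blender.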
Gameplay
The experience depends on the user's PC and the tracking hardware used. The recommended SteamVR headsets are the Valve Index or the HTC Vive. A Quest HMD may also produce reasonable results. It is also possible to use the software without an HMD (e.g. when livestreaming). Full body tracking is only available when using feet and hip trackers (with optional elbow, knee and chest trackers). Users may achieve more realistic tracking results when using body trackers, but body trackers are optional and standing mocap is supported. Further realism may be achieved on compatible avatars by also enabling face capture or using a Vive Pro Eye for gaze and blink tracking.
History
Originally this was designed as an intuitive way for users to create virtual training videos and presentations in an immersive VR environment for added realism, and then export their animations for rendering. The project was made available to a community for beta testing and has since received feedback and many feature requests, which have helped add to the utility of the software for a variety of different creators.
Available on devices:
- Windows