Ncam Technologies, a developer of real-time augmented reality technology for M&E, has introduced its new Mk2 Camera Bar, Mk2 Server and Ncam Reality 2020 software. First shown as a prototype at IBC2019, the Mk2 Camera Bar uses Intel RealSense hardware that has been modified to suit more demanding broadcast and film environments.

The new camera bar is smaller, lighter and able to mount more flexibly to a greater variety of camera rigs, and can be used with a jib, Steadicam, wire cam and drones. Previous-generation hardware required an ethernet tether to return tracking data to a server running Ncam Reality software. The next-generation software now runs on the Mk2 Server, which can be mounted on the camera or rig itself, meaning all camera tracking and lens data is computed locally. This means Ncam can now offer fully wireless tracking over a standard RF camera link, opening up many possibilities for the remote production of AR graphics and freeing up rack space for outside broadcast applications.

The Ncam Reality 2020 software suite has also been redesigned, with key enhancements in several areas. These include simplified operation; hybrid feature extraction covering natural features, markers and fiducials; wireless-ready functionality with the Mk2 hardware; and global performance and stability improvements. Additionally, the AR Suite software, in its Lite form, now comes bundled as standard, providing seamless, future-proofed integration with the latest Unreal Engine 4 toolset and an out-of-the-box solution for high-fidelity realtime VFX.

Other improvements introduced with the Mk2 include: faster and more accurate setup and calibration, further enhanced by a wizard-driven, role-based UI that reduces the need for specialist operators; improved tracking accuracy, including the ability to track non-natural features with no requirement to "learn" any marker patterns; a more rugged design, with witness lenses now mounted internally; and a simplified product offering and pricing model, with options to purchase outright or license via annual subscription.

According to Nic Hatch, Ncam's CEO, "This new platform will be the foundation of our technology moving forward and is just the beginning in allowing us to help customers realize their vision without having to worry about technology. The close partnerships we have with the likes of Intel, Epic and others will allow us to leverage further enhancements in both tracking and rendering technologies, as well as our own developments around spatial environment data capture and its reuse in non-live environments."

Autodesk has released Flame 2021 with new features aimed at accelerating creative workflows for VFX, color grading, look development and editorial finishing. The release increases workflow flexibility for artists, expands AI capabilities with new machine learning-powered human face segmentation, and simplifies finishing for streaming services with new functionality for Dolby Vision HDR authoring and display. In response to user requests, it also adds a GPU-accelerated Physical Defocus effect and finishing enhancements that make it easier to adjust looks across many shots, share updates with clients and work faster.

Useful for compositing, color grading and cosmetic beauty work, the AI-based face segmentation tool automates all tracking and identifies and isolates facial features - including nose, eyes, mouth, laugh lines and cheekbones - for further manipulation. Face matching algorithms can also handle specialized tasks, such as isolating a specific mole or scar, through custom layout workflows. These built-in machine learning analysis algorithms isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.
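The face segmentation workflow described in the Flame release boils down to a familiar compositing idea: a per-pixel label map isolates one facial feature so it can be graded independently and composited back over the frame. The following is a minimal, hypothetical NumPy sketch of that idea; the function names and the integer label convention are illustrative assumptions, not Flame's actual API.

```python
import numpy as np

def isolate_feature(frame, mask, feature_id):
    """Cut one labeled facial feature out of a frame as premultiplied RGBA.

    frame: (H, W, 3) float image in [0, 1]
    mask:  (H, W) integer label map (hypothetical convention: 1 = nose, ...)
    """
    alpha = (mask == feature_id).astype(frame.dtype)
    # Premultiply RGB by alpha so the cutout composites cleanly.
    rgb = frame * alpha[..., None]
    return np.dstack([rgb, alpha])

def composite_over(fg_rgba, bg_rgb):
    """Standard 'over' composite of a premultiplied RGBA layer onto RGB."""
    alpha = fg_rgba[..., 3:4]
    return fg_rgba[..., :3] + bg_rgb * (1.0 - alpha)

# Example: brighten only the region labeled 1, leave the rest untouched.
frame = np.full((4, 4, 3), 0.5)
mask = np.zeros((4, 4), dtype=int)
mask[1:3, 1:3] = 1                      # pretend this came from a segmenter
cutout = isolate_feature(frame, mask, 1)
cutout[..., :3] *= 1.4                  # isolated grade on that feature only
result = composite_over(cutout, frame)
```

In a real pipeline the mask would come from a trained segmentation model tracked across frames; the compositing math above is independent of how the mask was produced.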