D6.3 Report on Testbed and Interoperability and Workflow Evaluation

Deliverable 6.3 summarizes the testbed used for the interoperability tests, presents a representative set of interoperability tests, describes the work done during the experimental production and gives an overview of the conducted user evaluation.

For the testbed, multiple PCs, each with dual Xeon E5-2687W v3 processors and a GeForce GTX 690 graphics board, were used to provide sufficient computational power as well as the flexibility to use Windows-based (Mamba and mocha) and Linux-based (Mistika) host platforms.

Plugins developed by 3FLEX can be used at various stages of the postproduction process. The preprocessing stage extracts and improves depth information from multi-view or multi-sensor camera setups and includes plugins for Stereo & Trifocal Rectification, Stereo & Trifocal Disparity Estimation, Colour Matching and Disparity to Depth Conversion. The postproduction stage uses the available depth information to improve common visual effects tasks and includes plugins for Semiautomatic Disparity Refinement, Entity Labelling from Colour and Depth Information, Automatic Clean Plate Creation for Planar or Complex Backgrounds, Advanced Rotoscoping, and tools for depth-based keying and colour grading. The rendering stage can be used to synthesize virtual views for stereoscopic and autostereoscopic 3D screens and includes plugins for Stereo, Trifocal and Video+Depth Rendering as well as Image+Depth Based Inpainting.

The developed plugins can be used in various workflows and hosts as described in the table below.


Interoperability tests were conducted to test the combined usage of multiple plugins. These tests comprise combinations of a few plugins on a single host, complete workflows (monoscopic, stereoscopic and trifocal) on a single host and more complex workflows using Mamba, mocha and Mistika.

To test the plugins in a realistic environment, an experimental production was conducted in collaboration with TH Köln. The produced short film features a lawnmower artist creating the letter pi on a lawn of the university campus, so the entire shoot had to take place outdoors. The camera was operated by the most experienced student and by the supervisor. The other students, who later worked on the postproduction, helped during setup and took care of the sound recording. Further students and faculty members were also scheduled to help during the actual shooting. Shooting was done with a trifocal setup and a small stereoscopic GoPro system. Postproduction took place on a system at the university. After ingest and identification of all sequences used for the final production, Mamba was used to rectify the multi-view data and estimate disparities. Visual effects were also applied with Mamba, while mocha was used for a few critical shots. Final rendering again took place in Mamba.

The user evaluation used a scoring system asking about usefulness, efficiency, controllability, learnability and satisfaction. For each component, answers could range from 5 (excellent) over good, fair and poor down to 1 (bad). Both individual plugins and complete workflows were evaluated. Many ratings were in the range of good or at least fair, indicating promising results not far from product-level quality. The few poor ratings usually point to higher speed requirements or the need to manually check quality after automatic processing.
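The five-point scale described above can be sketched as a simple mapping; the function and variable names below are illustrative only and do not appear in the deliverable:

```python
# 5-point scale as described in the evaluation: 5 = excellent ... 1 = bad
SCALE = {"excellent": 5, "good": 4, "fair": 3, "poor": 2, "bad": 1}

def mean_score(ratings):
    """Average a list of verbal ratings on the 1-5 scale."""
    return sum(SCALE[r] for r in ratings) / len(ratings)

# A workflow rated "good", "fair", "good" lands between fair and good.
workflow_rating = mean_score(["good", "fair", "good"])
```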

Generally, it could be shown that for many types of sequences the 3FLEX plugins provide a substantial improvement in productivity. Since the applied technology is not suitable for all sequences, the operator needs dedicated experience to choose the right tool for a specific job.

D7.4 Planning for exploitation 2

This deliverable reports the results of the activities conducted within WP7 of the 3FLEX project including the market analysis and monitoring as well as the definition of plans for exploitation and dissemination.

The market analysis gives an overview of the post-production market in the EU and worldwide, including a closer look at the Chinese market given its growing importance. It further analyses current trends and issues at both the industrial and the scientific level. This is complemented with a benchmarking table and an analysis of the impact of the 3FLEX results on the project partners and the overall industry.

Current trends and challenges show that even though 3D has lost momentum during the past years, it is still of great interest for the cinema industry and for broadcasters, which still demand more cost-effective ways of generating 3D content. This is a clear opportunity for the 3FLEX workflow and tools, which aim precisely at reducing costs in 3D productions while allowing more flexibility and creativity during the postproduction process. On the other hand, the analysis of industrial and scientific trends shows that the competition is still rather limited, even though several tools that exploit depth information in post-production processes have appeared on the market.

Based on the collected information, the final exploitation plan, which updates the initial one from D7.2, is presented.

Finally, this deliverable goes over dissemination activities carried out during the project and the dissemination material created and presents the plan for a future dissemination event.

D7.3 3FLEX publishable report

The EU-funded project “3FLEX: Depth-enabled workflow for 2D and multiview video production” (http://3flex-project.eu) aims to enable existing 2D and 3D production workflows to use and take advantage of available depth information for new degrees of flexibility, cost savings and improved efficiency.

The main result of the 3FLEX project is an advanced workflow for flexible 2D and 3D production, based on the extension of existing post-production platforms with a set of plugins built on cutting-edge computer vision techniques. These plugins cover the whole post-production chain, from the extraction of depth information, through the use of depth information for visual effects, to the rendering of different 2D and 3D output formats.

The 3FLEX workflow and tools can be used for a wide range of 2D and 3D productions, including fiction, documentary and commercial work. On the input side, the workflow supports various camera setups, including stereoscopic and trifocal cameras as well as image+depth sensors. Existing post-production tasks are improved and extended, such as depth- and object-based visual effects or depth-based colour grading and finishing. On the output side, different 2D and 3D formats can be rendered without considerable effort.

3FLEX impacts the post-production industry and market with differentiating technologies based on video+depth for live action scenes. 3FLEX not only improves the efficiency and flexibility of common post-production tasks but also provides extended creative possibilities. The 3FLEX workflow and tools overcome some of the major issues in current 3D post-production workflows, such as native 3D and 2D/3D conversion, and improve visual effects for 2D productions thanks to the use of depth maps.

The 3FLEX project and its results have been presented at several events, including the 3FLEX booth at the Future Zone of IBC 2015, where near-to-final results were shown. This allowed the project to generate a strong impact on the audience and to foster interest in future exploitation of the results.

Read the full report

D5.3 Final Plugins for Depth Image Based Multiview Rendering

Deliverable D5.3 describes the final plugins for depth image based rendering and inpainting.

As with the rectification and disparity estimation plugins described in D3.3, rendering also follows the approach of providing different plugins for different workflows. Three rendering plugins were therefore implemented: Trifocal Rendering, Stereo Rendering and Video+Depth Rendering. The plugins support different input and output formats, including spatial multiplex, separate images and the OpenFX multi-view extension in Mamba and Mistika. All rendering plugins can render the following formats: 2D, stereoscopic 3D, autostereoscopic 3D mixed for Tridelity and Alioscopy displays, as well as multi-view image output (5 and 8 views) for external mixing software.
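Conceptually, depth image based rendering forward-warps each pixel by a scaled disparity to synthesize a virtual view. The minimal sketch below (function names, hole marker and depth-buffer convention are our own, not the plugins' implementation) also shows how the disocclusion holes arise that the inpainting plugins later fill:

```python
import numpy as np

def synthesize_view(image, disparity, alpha=0.5):
    """Forward-warp pixels horizontally by a scaled disparity to render a
    virtual view between the input cameras (alpha = 0 keeps the source view).
    Holes (disocclusions) stay at -1 and would be filled by inpainting."""
    h, w = image.shape[:2]
    out = np.full_like(image, -1)
    depth_buf = np.full((h, w), -np.inf)   # keep nearest pixel (largest disparity)
    for y in range(h):
        for x in range(w):
            d = disparity[y, x]
            xt = int(round(x - alpha * d))
            if 0 <= xt < w and d > depth_buf[y, xt]:
                depth_buf[y, xt] = d
                out[y, xt] = image[y, x]
    return out

# Constant disparity of 2 and alpha = 0.5: every pixel shifts one column to
# the left, leaving a one-pixel disocclusion hole at the right border.
img = np.arange(1, 6).reshape(1, 5)
disp = np.full((1, 5), 2)
virtual = synthesize_view(img, disp, alpha=0.5)
```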

For image and depth based inpainting two plugins have been implemented. One plugin works with individual camera/depth input pairs and provides an automatic clean-plate which can be used later to fill holes during the depth image based rendering. The second plugin can be used after the rendering to fill already existing holes. It has been designed for a trifocal spatial multiplex workflow.
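Assuming a clean plate aligned with the rendered view and a known hole-marker value (both assumptions for illustration, not details from the deliverable), the role of the first plugin's clean plate can be sketched as a masked copy:

```python
import numpy as np

def fill_holes_from_clean_plate(rendered, clean_plate, hole_value=-1):
    """Replace disocclusion holes in a rendered view with the corresponding
    clean-plate pixels (a simplified sketch of clean-plate-based filling)."""
    out = rendered.copy()
    holes = rendered == hole_value
    out[holes] = clean_plate[holes]
    return out

rendered = np.array([[1, -1], [-1, 4]])   # -1 marks disocclusion holes
plate = np.full((2, 2), 9)                # precomputed clean plate
filled = fill_holes_from_clean_plate(rendered, plate)
```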

D4.7 Algorithms for depth-enabled postproduction tools

This deliverable reports the activities carried out in the different tasks defined in the working plan of the activity WP4 “Depth enabled post-production tools”.

This document presents a set of tools that exploit depth information in different common post-production tasks. Depth-enabled post-production tools allow editing operations that go beyond the limitations of current video post-production. In particular, HHI describes the semi-automatic depth map refinement (T4.1), while EUT focuses on advanced rotoscoping (T4.2), clean-plate creation for planar backgrounds (T4.3), entity labelling (T4.4) and clean-plate creation for complex scenes. All of these techniques significantly help the user manipulate visual content. Finally, the document also presents new nodes that take advantage of depth information for visual effects in 2D and 3D productions.

The post-production tools introduced in this document have been implemented as plugins for the mocha platform by Imagineer Systems and the Mistika/Mamba platforms by SGO.

D6.4 Acquisition of Test Material for Experimental Production and Interoperability Tests

Deliverable D6.4 provides an overview of additional test material generated or shot for the 3FLEX project.

Using the open-source movie Big Buck Bunny, a set of sequences along with ground-truth information was generated to facilitate the quantitative evaluation of the developed plugins.

A joint shooting with a company interested in the 3FLEX technology took place in a church in Leipzig, using a new trifocal setup with a RED Scarlet camera along with two Indiecam GS2K satellite cameras.

An ARRI Alexa with two sinaCAM satellite cameras was used to shoot footage for the experimental production taking place at TH Köln.

D3.3 Final Plugins for Depth-Enabling Preprocessing

With deliverable D3.3, final versions of the pre-processing plugins are available. All plugins now follow a common structure and have been tested on Mamba/Mistika, Nuke and Natron. On Mamba/Mistika the new OpenFX multi-view/multi-part extension can now be used. In addition, spatial multiplex is still supported on all platforms.

Stereo and trifocal rectification plugins have been updated for better interoperability with different host platforms. Stereo rectification modifies only one selectable input image, while trifocal rectification always warps the satellite images and keeps the centre image unchanged. The parameters of both versions have been aligned and simplified. For trifocal rectification, the possibility to manually define shift parameters has been added. Results have been compared against a ground-truth sequence and show no significant deviation.
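Full rectification warps a view with a homography so that epipolar lines become horizontal. As a toy stand-in for "modifying only one input image", the sketch below estimates a robust vertical offset from point correspondences and shifts only one view (all names are illustrative, not the plugin's API):

```python
import numpy as np

def vertical_alignment_offset(pts_left, pts_right):
    """Estimate the vertical offset that best aligns correspondences.
    A toy stand-in for full rectification, which in general applies a
    homography so that epipolar lines become horizontal."""
    dy = pts_left[:, 1] - pts_right[:, 1]
    return float(np.median(dy))            # median is robust to bad matches

def shift_rows(img, dy):
    """Warp only one image by an integer vertical shift (the 'modified' view),
    padding uncovered rows with zeros."""
    out = np.zeros_like(img)
    dy = int(round(dy))
    if dy >= 0:
        out[dy:] = img[:img.shape[0] - dy]
    else:
        out[:dy] = img[-dy:]
    return out
```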

Disparity estimation has also been split into two separate plugins for stereo and trifocal workflows. Parameters have been aligned between both versions and simplified. Results have been evaluated against ground truth and show a high percentage of identical or almost identical values.
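To illustrate what disparity estimation computes, here is a naive SAD block matcher along the horizontal epipolar line; the actual plugins use far more sophisticated estimators, and all names below are our own:

```python
import numpy as np

def block_match_disparity(left, right, max_disp=16, block=3):
    """Naive sum-of-absolute-differences block matching: for every pixel in
    the left image, search along the same row of the right image for the
    horizontal shift (disparity) with the lowest matching cost."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int64)
            best, best_d = None, 0
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int64)
                cost = np.abs(patch - cand).sum()
                if best is None or cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair where the left image is the right image shifted by five pixels, the matcher recovers a disparity of 5 in the interior.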

The depth filtering plugin has been updated in terms of performance and usability while the underlying filter has been maintained.

Disparity to depth conversion provides a new scaling parameter and has been updated to the common plugin software structure.
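Disparity-to-depth conversion follows the standard pinhole relation Z = f · B / d. A minimal sketch, in which the parameter name `scale` is a hypothetical stand-in for the plugin's new scaling parameter:

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m, scale=1.0):
    """Convert a disparity map (in pixels) to metric depth via Z = f * B / d.

    `scale` mimics the plugin's scaling parameter (name is hypothetical)."""
    d = np.asarray(disparity, dtype=np.float64)
    depth = np.full(d.shape, np.inf)     # zero disparity -> point at infinity
    valid = d > 0
    depth[valid] = scale * focal_length_px * baseline_m / d[valid]
    return depth

# f = 1000 px, B = 0.1 m: 8 px of disparity corresponds to 12.5 m of depth.
disp = np.array([[8.0, 4.0], [2.0, 0.0]])
depth = disparity_to_depth(disp, focal_length_px=1000.0, baseline_m=0.1)
```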

A colour matching plugin has been implemented to cope with the significantly different colours that occur in typical trifocal setups due to the use of different cameras. The method and its parameters are described, and the plugin has been evaluated on realistic footage.
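Colour matching between different cameras is often approximated by per-channel histogram matching; this single-channel sketch shows the common textbook technique (not necessarily the plugin's actual method):

```python
import numpy as np

def match_histogram(source, reference):
    """Map the intensity distribution of `source` onto that of `reference`
    by matching their cumulative distribution functions (one channel)."""
    s_values, s_inverse, s_counts = np.unique(source.ravel(),
                                              return_inverse=True,
                                              return_counts=True)
    r_values, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # For each source intensity, find the reference intensity at the same CDF.
    mapped = np.interp(s_cdf, r_cdf, r_values)
    return mapped[s_inverse].reshape(source.shape)
```

For a colour image, the same mapping would be applied to each channel independently; a source that merely has a constant brightness offset is mapped exactly back onto the reference.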

3FLEX flyer

The 3FLEX Consortium has produced a flyer to present the project. Learn how 3FLEX enables existing 2D and 3D production workflows to use and take advantage of available depth information for improved efficiency and flexibility.

Discover the main applications and details of the plugins being developed, covering the whole postproduction chain from preprocessing through visual effects to finishing.

Download the flyer now!

D5.2 Draft plugin for image+depth based inpainting

This deliverable reports the activities carried out in task T5.3 “Image and depth based inpainting in multiple views” as defined in the working plan of activity WP5 “Depth image based multiview rendering”. The expected outcome of this task is a plugin for the Mistika/Mamba platform by SGO. The plugin implements an inpainting algorithm that fills holes (disocclusions) generated in newly rendered views.

In this document we first describe the algorithm and then the plugin's installation and usage. Evaluation results are presented in the later sections of the report, followed by our conclusions.