D6.3 Report on Testbed and Interoperability and Workflow Evaluation

Deliverable 6.3 summarizes the testbed used for the interoperability tests, presents a representative set of interoperability tests, describes the work done in the experimental production, and gives an overview of the conducted user evaluation.

The testbed consists of multiple PCs, each equipped with dual Xeon E5-2687W v3 processors and a GeForce GTX 690 graphics board, providing sufficient computational power as well as the flexibility to run Windows-based (Mamba and mocha) and Linux-based (Mistika) host platforms.

The plugins developed in 3FLEX can be used at various stages of the postproduction process. The preprocessing stage extracts and improves depth information from multi-view or multi-sensor camera setups and includes plugins for Stereo & Trifocal Rectification, Stereo & Trifocal Disparity Estimation, Color Matching and Disparity to Depth Conversion. The postproduction stage uses the available depth information to improve common visual effects tasks and includes plugins for Semiautomatic Disparity Refinement, Entity Labelling from Colour and Depth Information, Automatic Clean Plate Creation for Planar or Complex Backgrounds, Advanced Rotoscoping, and tools for depth-based keying and colour grading. The rendering stage can be used to synthesize virtual views for stereoscopic and autostereoscopic 3D screens and includes plugins for Stereo, Trifocal and Video+Depth Rendering as well as Image+Depth based Inpainting.
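The Disparity to Depth Conversion step, for instance, rests on the standard rectified-stereo relation Z = f·B/d (focal length times baseline divided by disparity). The sketch below is only a minimal illustration of that relation, not the actual plugin implementation; the function name, focal length and baseline values are chosen purely for the example.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_length_px, baseline_m, min_disparity=1e-6):
    """Convert a disparity map (in pixels) to metric depth via Z = f * B / d."""
    d = np.maximum(disparity_px, min_disparity)   # guard against division by zero
    return focal_length_px * baseline_m / d

# Toy example: four disparity values from a rectified stereo pair
disparity = np.array([[32.0, 16.0],
                      [8.0, 4.0]])
depth = disparity_to_depth(disparity, focal_length_px=1500.0, baseline_m=0.065)
print(depth)   # larger disparities map to smaller depths
```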

The developed plugins can be used in various workflows and hosts as described in the table below.

[Table: supported workflows and host applications per plugin]

Interoperability tests were conducted to verify the combined use of multiple plugins. These tests comprise combinations of a few plugins on a single host, complete workflows (monoscopic, stereoscopic and trifocal) on a single host, and more complex workflows spanning Mamba, mocha and Mistika.
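As a purely illustrative sketch of what chaining plugins into a workflow means, the snippet below composes a sequence of processing stages. The stage names are hypothetical stand-ins; in the actual tests the plugins run inside the host applications, not as Python functions.

```python
from functools import reduce

def run_pipeline(clip, stages):
    """Feed a clip through a sequence of processing stages, mimicking how the
    interoperability tests chain several plugins within one host."""
    return reduce(lambda data, stage: stage(data), stages, clip)

# Hypothetical stand-ins for the plugin stages; the real plugins run inside
# Mamba, mocha or Mistika rather than as Python code.
rectify   = lambda clip: {**clip, "rectified": True}
disparity = lambda clip: {**clip, "disparity": "estimated"}
refine    = lambda clip: {**clip, "disparity": "refined"}
render    = lambda clip: {**clip, "output": "stereo views"}

print(run_pipeline({"shot": "test sequence"}, [rectify, disparity, refine, render]))
```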

To test the plugins in a realistic environment, an experimental production was conducted in collaboration with TH Köln. The resulting short film features a lawnmower artist mowing the letter pi into a lawn on the university campus, so the entire shoot had to take place outdoors. The camera was operated by the most experienced student and by the supervisor, while the other students, who later worked on the postproduction, helped during setup and handled the sound recording. Further students and faculty members were scheduled to assist during the actual shooting. Shooting was done with a trifocal setup and a small stereoscopic GoPro system. Postproduction took place on a system at the university. After ingest and identification of all sequences used for the final production, Mamba was used to rectify the multi-view data and estimate disparities. Visual effects were also applied in Mamba, while mocha was used for a few critical shots. Final rendering again took place in Mamba.

The user evaluation used a scoring system covering usefulness, efficiency, controllability, learnability and satisfaction. For each component, answers ranged from 5 (excellent) through good, fair and poor down to 1 (bad). Results for individual plugins as well as complete workflows were evaluated. Many ratings fell into the good or at least fair range, indicating promising results that are not far from product-level quality. The occasional poor rating usually points to the need for higher processing speed or for manual quality checks after automatic processing.
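The snippet below shows how such verbal ratings can be mapped to the five-point scale and averaged per criterion. The answers listed are placeholders for illustration only; the actual ratings are reported in the deliverable.

```python
from statistics import mean

# Numeric mapping of the verbal five-point scale used in the evaluation
SCALE = {"excellent": 5, "good": 4, "fair": 3, "poor": 2, "bad": 1}

# Placeholder answers for one plugin; real values are in the deliverable
answers = {
    "usefulness":      ["good", "excellent", "good"],
    "efficiency":      ["fair", "good", "poor"],
    "controllability": ["good", "good", "fair"],
    "learnability":    ["fair", "good", "good"],
    "satisfaction":    ["good", "fair", "good"],
}

for criterion, votes in answers.items():
    print(f"{criterion:15s} {mean(SCALE[v] for v in votes):.2f}")
```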

Overall it could be shown that for many types of sequences the 3FLEX plugins provide a substantial improvement in productivity. Since the applied technology is not suitable for every sequence, the operator needs dedicated experience to choose the right tool for a specific job.
