
Latest News and Results

November 2014: RE@CT final demonstrator at CVMP2014

RE@CT presented its final demonstrator at CVMP2014 in London on 13th & 14th November.

RE@CT stand at CVMP2014

This video was shown to illustrate the final project test shoot and processing of the data to generate animated 3D models suitable for use in an interactive TV production:

Watch the video [YouTube]

The project also presented at the EU special session.


July 2014: RE@CT video augmentation technology shown at the BBC's Commonwealth Games Showcase in Glasgow

The BBC demonstrated the Augmented Video Player at its Commonwealth Games Showcase event at the Glasgow Science Centre.


BBC R&D Venue Explorer


March 2014: RE@CT datasets released in conjunction with Eurographics paper

The University of Surrey have released datasets from RE@CT in conjunction with their paper on 4D Video Textures, presented at Eurographics 2014.

January 2014: RE@CT releases 3D character app to the public

The first app, showing the 3D RE@CT character captured for the first demonstrator walking and jumping, has been made freely available and can be downloaded from Artefacto's website.

September 2013: RE@CT demonstration at IBC2013

At IBC2013, the project demonstrated results from the first prototype on BBC R&D's stand in the Future Zone; coverage is available on the 'BBC R&D at IBC 2013' blog.

IBC2013 demonstration

The demonstrations were very successful and were amongst those selected by IBC for their 'What caught my eye' TV reports, courtesy of IBC TV.

June 2013 : RE@CT demonstration at MIRAGE 2013

RE@CT successfully showed the first prototype of the cultural heritage game at the MIRAGE 2013 conference in Berlin. This video was shown to illustrate the test shoot and processing of the data to generate animated 3D models suitable for use in an augmented reality game:

Watch the video [YouTube]

The project also presented a paper on some of its results, which is available here:
High Detail Flexible Viewpoint Facial Video from Monocular Input using Static Geometric Proxies [pdf]

May 2013: Augmented Reality prototype shown in museum in Brittany

The first prototype of the augmented reality cultural heritage game incorporating RE@CT technologies was demonstrated at the Museum of the Château de Montfort-sur-Meu in Brittany (France) in May 2013. The game was used as part of an exhibition on the history of the Lords of Montfort and allowed school children to learn about local history using an interactive educational application.

Watch the video [mp4]
Video courtesy of and copyright ©TV Rennes – Audrey Helleu

November 2012: Summary of first year's achievements

Specifications

The first major achievement during the first project year was the production of a detailed specification of the work packages involved and the interfaces between them, as well as a specification of the project platform and software modules. This specification work was documented and compiled into deliverable D5.1.1 ‘Specification of Platforms, Modules and Interfaces’. This document forms the basis for further integration of components in the project and will be updated as the project progresses.

The next major activity was the coordination, production and delivery of deliverable D6.1 ‘Detailed Specification of Use Case Scenarios’. This document describes use case scenarios for the technology developed by the RE@CT project, focussed on two use cases that will be developed into demonstrators. The first is a cultural heritage VR or AR application in a public environment, typically a museum or exhibition; the second is an application alongside a TV production, either a live scenario during the TV production or an online application during and after the broadcast. The first of these use cases will be used for the first demo production in the first half of the project.

D6.1 also describes RE@CT’s specific production requirements and a basic layout of the software components that have already been developed or will be developed during the project, as outlined in the DoW. This specification is more ‘production centric’, as it describes the production flow and the interfaces to the RE@CT components for the professional user of the system. It therefore complements the more technical interface specification of D5.1.1.

Data Capture and Analysis

Simultaneous capture of high-resolution 3D video of actor face and whole-body performance

Data capture and analysis

A multiple camera system for acquisition of actor performance at HD resolution has been developed. The system includes a set of 12 static HD cameras, and additional HD cameras on pan/tilt heads to maintain a close-up view of the actor’s face. A head-mounted camera rig can also be used to provide very close-up images of the face. The system can be synchronised with a conventional marker-based motion capture system to provide ground truth data to allow the image-based animation capture to be evaluated. The marker-based system also allows animation of props such as swords to be captured.
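The evaluation against marker-based ground truth described above amounts to comparing joint positions recovered by the image-based system with those from the motion capture system. A minimal sketch of such a comparison follows; the function name and the sample data are purely illustrative, not the project's actual tooling:

```python
# Illustrative sketch: mean per-joint position error between image-based
# pose estimates and synchronised marker-based ground truth.
# All names and data here are hypothetical.

import math


def mean_joint_error(estimated, ground_truth):
    """Average Euclidean distance (same units as the input) over matched joints."""
    assert len(estimated) == len(ground_truth)
    total = 0.0
    for (ex, ey, ez), (gx, gy, gz) in zip(estimated, ground_truth):
        total += math.sqrt((ex - gx) ** 2 + (ey - gy) ** 2 + (ez - gz) ** 2)
    return total / len(estimated)


# Two joints, 3D positions in metres (made-up values).
est = [(0.0, 1.0, 0.0), (0.5, 1.0, 0.0)]
gt = [(0.0, 1.0, 0.1), (0.5, 1.1, 0.0)]
print(round(mean_joint_error(est, gt), 3))  # 0.1
```

In practice the two streams would first be aligned in time using the shared synchronisation signal before errors are computed.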

Representation of actor performance for interactive animation and realistic rendering

Automatic construction of a motion graph

Automatic construction of a motion graph representation of captured 3D video has been developed to enable interactive animation of the face and body whilst preserving the high-resolution visual quality of the captured data for rendering. Performance analysis algorithms have been developed to transform raw 3D video into a 4D model with a temporally-coherent structured representation. In the first phase of the project, 3D video is being integrated into a conventional animation pipeline using skeletal motion controls with the added realism of captured actor performance. Parameterised surface motion control is also being developed, to provide more realistic dynamics.

Watch a video showing a photorealistic 3D animated face created by mapping video onto a tracked 3D proxy head model [mp4]
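A motion graph of this kind can be pictured as a set of captured clips linked wherever one clip's ending pose is close to another's starting pose, so that playback can branch between clips seamlessly. The sketch below is illustrative only, with drastically simplified poses and invented names (the project's actual representation operates on 3D video surfaces, not joint tuples):

```python
# Illustrative sketch of motion-graph construction: nodes are captured clips,
# and an edge A -> B exists when A's final pose is close to B's first pose.
# Clip, MotionGraph and pose_distance are hypothetical names.

from dataclasses import dataclass, field


@dataclass
class Clip:
    name: str
    first_pose: tuple  # simplified: a pose is a tuple of joint angles
    last_pose: tuple


def pose_distance(p, q):
    """Crude pose similarity: largest per-joint difference."""
    return max(abs(x - y) for x, y in zip(p, q))


@dataclass
class MotionGraph:
    clips: list = field(default_factory=list)
    edges: dict = field(default_factory=dict)  # clip name -> reachable clip names

    def add_clip(self, clip):
        self.clips.append(clip)
        self.edges.setdefault(clip.name, [])

    def build(self, threshold):
        """Connect clip A to clip B when the transition would look smooth."""
        for a in self.clips:
            for b in self.clips:
                if pose_distance(a.last_pose, b.first_pose) <= threshold:
                    self.edges[a.name].append(b.name)

    def transitions(self, name):
        return self.edges[name]


# A looping walk cycle and a jump that ends in a very different pose.
walk = Clip("walk", first_pose=(0.0, 0.1), last_pose=(0.0, 0.1))
jump = Clip("jump", first_pose=(0.1, 0.2), last_pose=(0.9, 0.8))
g = MotionGraph()
g.add_clip(walk)
g.add_clip(jump)
g.build(threshold=0.3)
print(g.transitions("walk"))  # ['walk', 'jump']: walk can loop or lead into jump
```

Interactive animation then reduces to walking this graph: any path through the edges yields a plausible, continuous performance assembled from captured data.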

Authoring tools for production of interactive content based on captured actor performance

Animation engine application interface

An animation engine application interface is being developed to support both offline scripting and interactive animation. This will allow captured actor performance to be re-sampled to produce new facial and whole body animation whilst maintaining video-quality rendering. Algorithms will be developed to control character interaction in the interactive environment and allow stylised motion for artistic expression. This objective will be met by developing new rendering tools using the data representations developed in the project.
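The dual offline/interactive interface described above can be sketched as an engine that accepts either a pre-authored action sequence or actions requested at runtime, admitting the latter only when the motion graph allows the transition. Everything below (class and method names, the toy graph) is a hypothetical illustration, not the project's actual API:

```python
# Hypothetical sketch of an animation-engine interface supporting both
# offline scripting and interactive animation over a motion graph.

from collections import deque


class AnimationEngine:
    def __init__(self, motion_graph):
        self.graph = motion_graph  # action -> list of allowed next actions
        self.queue = deque()
        self.current = None

    def script(self, actions):
        """Offline mode: load a pre-authored action sequence verbatim."""
        self.queue.extend(actions)

    def request(self, action):
        """Interactive mode: queue an action only if the graph permits it."""
        last = self.queue[-1] if self.queue else self.current
        if last is None or action in self.graph.get(last, []):
            self.queue.append(action)
            return True
        return False

    def step(self):
        """Advance playback by one action; returns the action now playing."""
        if self.queue:
            self.current = self.queue.popleft()
        return self.current


graph = {"idle": ["walk", "idle"], "walk": ["walk", "jump", "idle"], "jump": ["idle"]}
engine = AnimationEngine(graph)
engine.script(["idle", "walk"])   # authored offline
engine.request("jump")            # requested interactively; allowed after "walk"
print([engine.step() for _ in range(3)])  # ['idle', 'walk', 'jump']
```

Constraining interactive requests to graph edges is what keeps runtime animation within the space of transitions that render seamlessly from the captured performance.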

Demonstration production of an immersive interactive application

Immersive interactive application

Content is being produced to build an interactive application, demonstrating the RE@CT technology in an augmented reality game scenario for a cultural heritage application. In the first phase of the project, detailed specifications for the user scenarios have been developed and a first prototype application using hybrid motion graph techniques is being planned. The feasibility of rendering a dynamic mesh object using the Unity3D engine has been verified, and this will be used as the basis for the interactive application.
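Rendering a dynamic mesh of the kind mentioned above amounts to keeping the triangle topology fixed while swapping the vertex buffer every frame. The following is a conceptual sketch of that playback pattern (plain Python with invented names, not Unity3D code):

```python
# Conceptual sketch of per-frame dynamic-mesh playback: topology is shared
# across all frames, and each tick swaps in that frame's vertex positions.
# DynamicMesh is a hypothetical illustration, not an engine API.

class DynamicMesh:
    def __init__(self, triangles, frames):
        self.triangles = triangles  # fixed topology shared by all frames
        self.frames = frames        # list of per-frame vertex position lists
        self.index = 0

    def tick(self):
        """Return the current frame's vertex buffer, then advance (looping)."""
        vertices = self.frames[self.index]
        self.index = (self.index + 1) % len(self.frames)
        return vertices


# One triangle whose apex moves slightly between two captured frames.
mesh = DynamicMesh(
    triangles=[(0, 1, 2)],
    frames=[
        [(0, 0, 0), (1, 0, 0), (0, 1.0, 0)],
        [(0, 0, 0), (1, 0, 0), (0, 1.1, 0)],
    ],
)
first = mesh.tick()
second = mesh.tick()
third = mesh.tick()  # playback loops back to the first frame
```

In a game engine this per-tick vertex update is exactly the operation whose feasibility was verified, since it must complete within the frame budget for smooth playback.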