Our Performance Capture Studio is a collaborative, experimental, cross-disciplinary performance space.
Research directions
Investigating new media technologies for live digital performance and cinematic pre-visualisation.
Our core pipeline leverages a dedicated motion capture volume and Unreal Engine's Live Link capabilities, while continually integrating peripheral technologies for evolving digital performance.
The WAAPA Performance Capture Stage supports myriad directions for rapid, iterative development of digital creative works through collaboration between performing artists and technical specialists.
Collaboration and development
Our space is designed so that artists and technical specialists can work together easily and quickly.
Collaborators across disciplines can rehearse and respond to real-time performance feedback to develop bespoke processes and performance outcomes. This rehearsal landscape also allows traditional and immersive screen-based projects to pre-visualise scenes, adapting and capturing performance through multiple stages of the production process.
Digital asset creation
Digital integration of human performance.
A central focus is the creation of digital assets that are scaled and retargeted to performers' functional anatomy. This process aspires to enable performers to interact directly with virtual co-performers and performance environments, minimising post-production requirements and enabling fully live, unedited performance opportunities.
Current infrastructure
- Vicon motion capture volume: 4m x 4m x 3m capture volume for up to four performers including finger tracking and customised tracked props.
- MetaHuman Animator facial performance capture system: Integrated facial and motion capture pipeline enables full performance capture possibilities.
- Dedicated open-plan Unreal Engine developer suite: Fully networked co-working space encouraging and facilitating essential interdisciplinary work.
- Rapid prototyping engine templates: Pre-built frameworks allow creatives to quickly adopt and test new ideas without starting from scratch, speeding up the creative process.
- In-house proprietary character and prop performance capture pipelines: Performance capture pipeline integrated with engine templates enables real-time capture and pre-visualisation of virtual environments. Rehearsals and rushes can be blocked and recorded in real time.
- Live audio integration into Unreal Engine for visualisation and spatialisation over a MIDI interface. This allows live sound to be incorporated into digital environments, where it can be visualised and spatialised (heard in 3D space) within Unreal Engine, enhancing the immersive experience.
- Oculus Quest 3 and HTC Vive headsets.
For more information, please contact Dr Luke Hopper.