This project will be a conduit of collaboration and communication between multidisciplinary artists and their audience. We have all been in our growing bubbles of isolation, and I feel communication through physicality is important to the stability of the social fabric. Victoria has been through a unique challenge; through this project we as artists can respond positively and with great innovation to the changing social landscape. I will motion capture performances by artists whose audiences and revenue have been decimated by the closing of venues and restrictions on travel. Releasing these performances online will allow the artists to connect with their audience through timely, visually unique and intimate experiences. I will develop a mobile workflow for my existing skills so I can send my motion capture hardware to the artists to capture themselves.


The latest period will allow me to collaborate remotely with specifically Victorian performing artists. I will purchase all remaining hardware and software necessary for the remote capture of body and facial performance data. I will then send the capture hardware (Rokoko Smartsuits, HTC Vive trackers, an iPhone X for face capture and a laptop for data ingest) to the artists to be recorded under my remote dramaturgy, along with audio capture. I will interpolate the spatial, skeletal and facial blendshape data in Houdini into a procedurally animated visual style that suits each artist. These styles will range from completely abstracted physical simulations using Vellum (virtual hair, fluid and cloth) to the more likeness-representative creation of a high-definition digital double avatar using Reallusion Character Creator.
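As a minimal sketch of the kind of data handling this step involves, assuming the facial capture is exported as a per-frame CSV of ARKit blendshape weights (the file name, column subset and smoothing window below are illustrative placeholders, not part of the finished pipeline), the weights can be read and cleaned before they drive facial channels in Houdini:

    # Sketch only: read per-frame ARKit blendshape weights from a hypothetical CSV
    # export and smooth them before they drive character channels in Houdini.
    import csv

    # A small subset of ARKit's blendshape coefficient names, for illustration.
    BLENDSHAPE_COLUMNS = ["jawOpen", "mouthSmileLeft", "mouthSmileRight", "browInnerUp"]

    def load_blendshape_frames(path):
        """Return a list of {blendshape: weight} dicts, one per captured frame."""
        frames = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                frames.append({name: float(row[name]) for name in BLENDSHAPE_COLUMNS})
        return frames

    def smooth(frames, window=3):
        """Moving-average smoothing to take jitter out of the raw capture."""
        smoothed = []
        for i in range(len(frames)):
            chunk = frames[max(0, i - window + 1): i + 1]
            smoothed.append({
                name: sum(f[name] for f in chunk) / len(chunk)
                for name in BLENDSHAPE_COLUMNS
            })
        return smoothed

    if __name__ == "__main__":
        frames = smooth(load_blendshape_frames("face_capture.csv"))
        # Each per-frame dict can then be keyed onto a character's facial
        # parameters inside Houdini (for example via its Python API or CHOPs).
        print(f"{len(frames)} frames loaded; first frame: {frames[0]}")

This is only an outline of the handoff between the capture hardware and the procedural animation stage; the actual channel names and import method would be tailored to each artist's rig.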


I am coming to the completion of my Sustaining Creative Workers Initiative grant, during which I researched and developed a workflow for this project as well as all the necessary skills. During this period I have moved from being a 3D motion designer to being professionally competent in coding with Unity and C#, Unreal Blueprints, Maya LT and Houdini, as well as Apple ARKit for facial tracking. I have designed a six-point Vive tracker motion capture rig and purchased an RTX 2080 for ultra-fast animation renders using Redshift, 64 GB of RAM, two Apple TrueDepth cameras and the relevant software.
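To illustrate the idea behind the six-point rig, here is a minimal sketch of the tracker-to-joint assignment, assuming a common head/hips/hands/feet layout; the serial numbers, joint names and calibration offsets are placeholder values, and in practice the live pose data would come from SteamVR at runtime:

    # Sketch only: how six Vive trackers might be bound to skeleton targets.
    # Serials, joint names and offsets are placeholders, not real devices.
    from dataclasses import dataclass

    @dataclass
    class TrackerBinding:
        serial: str          # hardware serial reported by SteamVR
        target_joint: str    # skeleton joint the tracker drives
        offset_m: tuple      # translation offset from tracker to joint, metres

    SIX_POINT_RIG = [
        TrackerBinding("LHR-AAAAAAA1", "head",       (0.0, -0.10, 0.05)),
        TrackerBinding("LHR-AAAAAAA2", "hips",       (0.0, -0.08, 0.00)),
        TrackerBinding("LHR-AAAAAAA3", "hand_left",  (0.0,  0.00, 0.02)),
        TrackerBinding("LHR-AAAAAAA4", "hand_right", (0.0,  0.00, 0.02)),
        TrackerBinding("LHR-AAAAAAA5", "foot_left",  (0.0, -0.05, 0.00)),
        TrackerBinding("LHR-AAAAAAA6", "foot_right", (0.0, -0.05, 0.00)),
    ]

    def joint_position(tracker_pos, binding):
        """Apply the calibration offset to a sampled tracker position
        (rotation omitted for brevity) to get the joint target for IK."""
        return tuple(p + o for p, o in zip(tracker_pos, binding.offset_m))

    if __name__ == "__main__":
        # Example: one sampled hip tracker position, in metres.
        print(joint_position((0.10, 0.95, -0.20), SIX_POINT_RIG[1]))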



 © Luke Constable (2020)