
character animation for virtual reality

Character animation for virtual reality has been a long-standing interest of mine since I first discovered I could upload animation files to Second Life back in 2005. Given how much communication is non-verbal, I have always felt there is a lot of room for improvement in 3D character animation in virtual reality, so over the years I have experimented extensively with ways to enable better character animation.

bvhacker, the notepad equivalent for bvh files
bvhacker: one of my early character animation projects

where am I now?

I have ordered a 50-camera mocap system from Qualisys, and have visited the Qualisys HQ in Gothenburg, Sweden. After a very productive meeting, we informally agreed to work together on beta testing their software and identifying desirable features as they develop live, full-body character animation streaming capabilities in the near future.

I recently received government approval to set up my motion capture studio on the tropical island paradise of Koh Phangan, Thailand, and have set up a public limited company called Virtuality Co., Ltd.
Due to the particular scheme I have been awarded approval under (the Thai Government Board of Investment), I have a number of privileges, including some very generous tax breaks and the ability to invite ‘visiting professionals’ to Thailand to collaborate on projects. With these plans in mind, I established a small co-working space on the island a couple of years ago.

Chaloklum Koh Phangan - home of Virtuality Co., Ltd

Location of Virtuality Co., Ltd – Chaloklum, Koh Phangan, Thailand

I already have an agreement in place with a dance studio on the island, whereby I will use the studio for motion capture. Regular contact with the studio should also provide me with an ongoing source of performer talent.

the plan

I intend to specialise in producing animation data for High Fidelity, the real time social VR platform.

phase 1 (the coming year)

Initially, I plan to produce regular FBX animations (bundled with associated avatar-animation.json files) and start selling them on the High Fidelity marketplace.
I also plan to offer custom / bespoke animation services for HiFi users, covering both mocap and animation data processing. Given the privileges that come with the government approval, coupled with the low costs of running a business in Thailand, I expect to be able to price these services very competitively.

phase 2 (the next year or two)

This is where it gets really interesting! About a year ago, a researcher called Daniel Holden presented a paper at SIGGRAPH describing an open source, AI-based character animation system: the Phase-Functioned Neural Network for Character Control (PFNN). In essence, it’s a novel technique that uses a neural network to calculate the next animation frame over a wide range of complex situations, with low CPU usage compared to playing back FBX animation data. His results are outstanding – see the demo movie below or the overview presented here for more detail.
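To make the “phase-functioned” part of the idea concrete, here is a minimal JavaScript sketch. The key trick in Holden’s paper is that the network weights themselves are a cyclic function of a phase variable (one footstep cycle maps to 0..2π), computed by Catmull-Rom cubic interpolation over four stored weight sets. The function and variable names below are illustrative, not taken from any codebase:

```javascript
// Standard Catmull-Rom cubic interpolation between y1 and y2.
function catmullRom(y0, y1, y2, y3, mu) {
  return y1
    + mu * (0.5 * y2 - 0.5 * y0)
    + mu * mu * (y0 - 2.5 * y1 + 2.0 * y2 - 0.5 * y3)
    + mu * mu * mu * (1.5 * y1 - 0.5 * y0 - 1.5 * y2 + 0.5 * y3);
}

// weightSets: array of 4 flat weight arrays (one per phase control point).
// phase: current phase in [0, 2π). Returns the blended weight array used
// for this frame's network evaluation.
function blendWeights(weightSets, phase) {
  const t = (4 * phase / (2 * Math.PI)) % 4; // position along the 4 control points
  const k = Math.floor(t);
  const mu = t - k;
  const pick = (i) => weightSets[((i % 4) + 4) % 4];
  const [w0, w1, w2, w3] = [pick(k - 1), pick(k), pick(k + 1), pick(k + 2)];
  return w1.map((_, j) => catmullRom(w0[j], w1[j], w2[j], w3[j], mu));
}
```

This is why the runtime cost stays modest: each frame is one weight blend plus one forward pass, rather than searching or blending whole animation clips.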

I’ve looked at the possibility of implementing a PFNN 3D character animation system in HiFi and completed some initial investigations (e.g. detailed analysis of the PFNN inputs, identification of differences in character armature – see here).
I have also been in discussion with other developers who have indicated that they’d be interested in participating in the implementation of the PFNN for HiFi.
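For anyone curious what those PFNN inputs actually look like, here is an illustrative sketch of how the per-frame input vector is assembled, following the structure described in Holden’s paper: a window of trajectory samples (past and future) expressed in the character’s local frame, plus the previous frame’s joint positions and velocities. The exact layout (and any extra gait / style channels) depends on how the network was trained, and the names here are hypothetical:

```javascript
// Assemble a PFNN-style input vector for one frame.
// trajectory: 12 samples { posX, posZ, dirX, dirZ }, root-relative, ground plane only.
// joints: per-joint { pos: [x,y,z], vel: [x,y,z] } from the previous frame,
// expressed in the root's local frame.
function buildInputVector(trajectory, joints) {
  const x = [];
  for (const p of trajectory) x.push(p.posX, p.posZ);  // trajectory positions
  for (const p of trajectory) x.push(p.dirX, p.dirZ);  // trajectory directions
  for (const j of joints) x.push(...j.pos);            // joint positions
  for (const j of joints) x.push(...j.vel);            // joint velocities
  return x; // 12*4 + nJoints*6 values in this simplified layout
}
```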

phase 3 (timescale not yet determined)

My mocap system will be capable of capturing up to six actors simultaneously. Working with Qualisys and building on HiFi’s Vive mocap functionality, I plan to enable live, full-body character animation streaming of performances: music, dance, theatre, stand-up comedy, etc. The island where I live (Koh Phangan, Thailand) is a hotbed for musical talent in particular.

Koh Phangan musician

Koh Phangan musicians

further ahead

I also have a number of other ideas I’d love to get working on:
~ Automatic locomotion gait adjustments for avatar moods, age and gender (This paper gives a great starting point)
~ Implementing a natural body language animation system for NPCs that reflects the sentiment / emotional content of speech
~ Generally attracting location independent future tech enthusiasts to come join me working from a tropical island paradise
But if I’ve learnt anything over the last three years or so, it’s that all these ideas of mine take time, and that I have to be patient!

next steps

Qualisys need 6 to 8 weeks to manufacture my motion capture system. Concurrently, I am organising the paperwork for importing the equipment into Thailand, a process that could also take a couple of months.

I have already completed some initial investigations into demoing the PFNN in JavaScript in High Fidelity. I am also looking into the possibility of implementing the PFNN player as a C++ plugin/dll for HiFi’s Interface, for the case where the PFNN is just too demanding for JavaScript. My progress on the JavaScript demo can be found on my github page here.
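The performance question comes down to a few matrix-vector products per frame. As a rough sketch of what the JavaScript player has to do each frame, here is a minimal three-layer forward pass with ELU activations (the activation Holden’s demo uses); `W0..W2` and `b0..b2` would come from the phase-based weight blending, and all names here are illustrative:

```javascript
// ELU activation, applied element-wise.
function elu(v) {
  return v.map((x) => (x > 0 ? x : Math.exp(x) - 1));
}

// Matrix-vector product; W is a flat row-major Float32Array of size rows*cols.
function matVec(W, rows, cols, v) {
  const out = new Float32Array(rows);
  for (let r = 0; r < rows; r++) {
    let s = 0;
    for (let c = 0; c < cols; c++) s += W[r * cols + c] * v[c];
    out[r] = s;
  }
  return out;
}

function add(a, b) { return a.map((x, i) => x + b[i]); }

// net: { nIn, h, nOut, W0, b0, W1, b1, W2, b2 } for one frame's blended weights.
function pfnnForward(net, input) {
  const h0 = elu(add(matVec(net.W0, net.h, net.nIn, input), net.b0));
  const h1 = elu(add(matVec(net.W1, net.h, net.h, h0), net.b1));
  return add(matVec(net.W2, net.nOut, net.h, h1), net.b2); // linear output layer
}
```

With a hidden layer of a few hundred units this is on the order of a few hundred thousand multiply-adds per frame, which is exactly the kind of tight loop where a C++ plugin would pay off if JavaScript proves too slow.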

PFNN incomplete implementation in High Fidelity

The current state of the JavaScript PFNN demo for High Fidelity

Once the studio is up and running, I intend to record a few hours of locomotion data for training the PFNN using a HiFi character armature. Both the learning software and the playback software will need to be adapted / re-written for the HiFi joint structure.
In the meantime, I have manually re-targeted the existing PFNN demo’s bvh files to a HiFi armature using MotionBuilder. This data is ready to enable the development and training of an adapted PFNN for HiFi and is available here.

Before that though, the first step would be to get a working demo up and running in High Fidelity. The output from the existing demo could be re-targeted (in code) to the HiFi armature. I have already analysed the armature differences (see analysis here), so I don’t think the re-targeting would present many problems.
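In-code retargeting of this kind usually boils down to a joint-name map plus a per-joint correction rotation applied to each frame. A hypothetical sketch, with placeholder joint names (the real mapping would come from the armature analysis) and the quaternion product supplied by the host application:

```javascript
// Placeholder mapping from source (PFNN demo) joints to target (HiFi) joints.
const JOINT_MAP = {
  Hips: "Hips",
  LeftUpLeg: "LeftUpLeg",
  // ... one entry per source joint
};

// sourceFrame: { jointName: quaternion } of local rotations for one frame.
// offsets: optional per-target-joint correction quaternions for rest-pose
// differences. multiplyQuat: quaternion product provided by the host app.
function retargetFrame(sourceFrame, jointMap, offsets, multiplyQuat) {
  const out = {};
  for (const [src, rot] of Object.entries(sourceFrame)) {
    const dst = jointMap[src];
    if (!dst) continue; // joint has no counterpart in the target armature
    const offset = offsets[dst];
    out[dst] = offset ? multiplyQuat(offset, rot) : rot;
  }
  return out;
}
```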

want to get involved?

I plan to develop the PFNN for High Fidelity as an open source project, starting immediately. I have a small budget for development, as although I am able to take on the implementation of the PFNN myself, I feel there are far more talented developers out there who could get the job done far quicker and more elegantly than me. If you would like to have a chat about getting involved, please drop me a line below.
