For films and video games, actors and actresses often don motion capture suits to ensure their computer-generated characters move believably. But for the rest of the animal kingdom, the equipment to make this happen is expensive, and many markers must be attached to translate movements from reality onto the screen.
Luckily for us, researchers from the University of Bath, UK, have done the hard bit for us, digitizing the movements of 14 different breeds of dog (recruited from a local animal shelter) wearing special doggy motion capture suits. Using this data, the team then created a computer model that captures the 3D motion of previously unseen dogs and translates it onto a screen without the need for a studio set-up. And yes, you could even use the technology to digitize your own dog.
“This technology allows us to study the movement of animals, which is useful for applications such as detecting lameness in a dog and measuring its recovery over time,” Sinéad Kearney, PhD researcher in the university’s motion capture research center (CAMERA), said in a statement. “For the entertainment industry, our research can help produce more authentic movement of virtual animals in films and video games. Dog owners could also use it to make a 3D digital representation of their pet on their computer, which is a lot of fun!”
Kearney and her colleagues’ computer model relies on the dataset gathered from some canine friends, including lanky lurchers and squat pugs, who were filmed in their motion-capture suits trotting and jumping. This information was then used to train a model that can accurately predict and replicate poses of other dogs imaged by a single RGBD camera. As well as recording the colors red, green, and blue (RGB) in every pixel of an image, like a normal digital camera, RGBD cameras also take note of the distance from the camera for each pixel.
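The color-plus-depth idea described above can be sketched in a few lines of code. The array sizes and camera intrinsics below are illustrative assumptions, not details of the Bath team's actual set-up; the sketch simply shows why knowing each pixel's depth lets a single camera recover 3D positions.

```python
import numpy as np

# A minimal sketch of an RGBD frame (illustrative values): "color" holds the
# red, green, and blue values for every pixel, just like a normal digital
# camera, while "depth" records each pixel's distance from the camera.
H, W = 480, 640
color = np.zeros((H, W, 3), dtype=np.uint8)     # RGB per pixel
depth = np.full((H, W), 2.0, dtype=np.float32)  # distance in metres per pixel

# With assumed pinhole-camera intrinsics, a pixel plus its depth can be
# back-projected to a 3D point in camera coordinates -- this extra depth
# channel is what makes one RGBD camera enough for 3D pose work.
fx = fy = 525.0          # focal lengths in pixels (assumed)
cx, cy = W / 2, H / 2    # principal point at the image centre (assumed)

def pixel_to_3d(u, v):
    """Back-project pixel (u, v) to a 3D point using its depth value."""
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

point = pixel_to_3d(320, 240)  # centre pixel, 2 m from the camera
```

Here the centre pixel back-projects to a point two metres straight ahead of the camera; an ordinary RGB image alone could not supply that third coordinate.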
“This is the first time RGBD images have been used to track the motion of dogs using a single camera,” Kearney said, “which is much more affordable than traditional motion capture systems that require multiple cameras.”
Having remotely presented their new technology at the Computer Vision and Pattern Recognition (CVPR) conference on June 14, the team hope that in the future they will be able to extend their dataset to produce more accurate results. In fact, the team have already begun to test their model on computer-generated versions of other four-legged animals, such as horses, cats, lions, and gorillas, to help give them a more lifelike quality.
“Our research is a step towards building accurate 3D models of animal motion along with technologies that allow us to very easily measure their movement,” Professor Darren Cosker, director of CAMERA, said in a statement. “This has many exciting applications across a range of areas – from veterinary science to video games.”