Dr. William Bares, associate professor of Computer Science at Millsaps College, has developed new software to help filmmakers more efficiently explore alternative ways of placing the camera to film scenes in computer-generated virtual worlds or in mock-ups used to plan live-action filming. The software could one day work alongside existing motion-sensing technologies such as those used in Polar Express (2004), Beowulf (2007), Monsters vs. Aliens (2009), Avatar (2009), and The Adventures of Tintin (2011).
Dr. William Bares demonstrates use of virtual camera software
Motion-sensing control of virtual cameras in filmmaking makes it possible for anyone skilled in operating a real camera to control a virtual camera by moving and turning a hand-held display screen just as they would a real camera. Dr. Bares and colleagues from Rennes, France, and Udine, Italy, created software that suggests several camera viewpoints from which a filmmaker can begin an upcoming shot.
With a quick glance at the handheld touch screen, the filmmaker can instantly compare a variety of distinct angles and distances. On touching a suggestion's image, the filmmaker immediately begins work, either recording from the selected viewpoint or adjusting (or even ignoring) the suggestion to improve its composition. In offering suggestions for the next shot, the software chooses alternatives that follow conventions of cinematography, such as showing a character facing the same direction as in the previous shot.
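To make the idea concrete, here is a minimal sketch of how such a shot-suggestion system might rank candidate viewpoints by a continuity rule like the one described above: prefer viewpoints that keep the character facing the same screen direction as in the previous shot. The class, scoring rule, and tie-breaking by shot distance are all illustrative assumptions, not Dr. Bares's actual algorithm.

```python
# Hypothetical sketch: rank candidate camera viewpoints, favoring
# screen-direction continuity with the previous shot. Not the actual
# algorithm from Dr. Bares's software; names and scoring are assumptions.

from dataclasses import dataclass

@dataclass
class Viewpoint:
    name: str
    distance: float          # camera-to-subject distance in meters
    screen_direction: str    # "left" or "right": which way the subject faces on screen

def rank_suggestions(candidates, previous_direction):
    """Order candidates so viewpoints matching the previous shot's screen
    direction come first; break ties by preferring closer framings."""
    def score(v):
        continuity = 0 if v.screen_direction == previous_direction else 1
        return (continuity, v.distance)
    return sorted(candidates, key=score)

candidates = [
    Viewpoint("wide reverse", 8.0, "left"),
    Viewpoint("medium shot", 3.0, "right"),
    Viewpoint("close-up", 1.2, "right"),
]

# With the previous shot facing right, the two right-facing viewpoints
# rank ahead of the reverse angle.
for v in rank_suggestions(candidates, previous_direction="right"):
    print(v.name)
```

A real system would score many more conventions (the 180-degree rule, shot-size progression, framing balance), but the same pattern of scoring and sorting candidates applies.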
Dr. Bares says that the software keeps creative human filmmakers in complete control of the final product, enabling them to explore alternative cinematic visions more efficiently. This work has been presented at ACM Multimedia (November 30, 2011, in Scottsdale, Arizona) and at ParisFX (December 15, 2011, in Paris, France). The work is patent pending, and anyone interested in technology transfer should contact INRIA-Rennes, France.