Author: Rob Reilly
GestureStorm, from Cybernet Systems Corp., processes the studio video feeds, recognizes the movement of a weatherman’s hand, and then sends cursor movement commands to the Windows-based Millennium radar controller. Although Cybernet has an interface to the Vipir, my local station hasn’t rolled it out yet.
The GestureStorm system has its roots in research done by Cybernet's vice president of research and development, Dr. Charles Cohen. Cohen studied dynamic gesture recognition for his Ph.D. at the University of Michigan in the mid-1990s. Back then, he used the short-lived but high-powered Transputer for the positioning calculations and video processing on a pan/tilt camera system. He developed algorithms in C under Unix to make the camera track a flashlight (a so-called tagged recognition system). His work was influenced by Michigan professor Dr. Daniel Koditschek and his robot research projects.
Cohen also worked on crane excavators operated by gestures. The machine could recognize one gesture as “come here,” another as “go away,” and so on.
Over time Cohen refined his projects into what are known as untagged recognition systems. Untagged systems dispense with the flashlight or other reference points attached to the arm and simply process the outline of the human in the video frame.
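To make the tagged/untagged distinction concrete, here is a toy Python sketch of the untagged idea: classify each pixel as background or person and keep the resulting silhouette mask. This is purely illustrative; the pixel test, threshold, and function names are my assumptions, not Cybernet's algorithm.

```python
# Toy sketch of untagged silhouette extraction: a pixel belongs to the
# person if the green channel does not dominate -- i.e., it is not part
# of a green-screen background. The dominance factor is made up.

def is_foreground(pixel, dominance=1.3):
    """pixel is an (r, g, b) tuple; True if it is not green-screen background."""
    r, g, b = pixel
    return g < dominance * max(r, b, 1)

def silhouette(frame, dominance=1.3):
    """Return a binary mask (lists of 0/1) with the same shape as frame."""
    return [[1 if is_foreground(px, dominance) else 0 for px in row]
            for row in frame]

# A tiny 2x3 "frame": green background with one skin-toned pixel.
frame = [[(20, 200, 30), (180, 120, 90), (25, 210, 35)],
         [(22, 205, 28), (24, 198, 31), (26, 202, 33)]]
mask = silhouette(frame)
# mask marks only the skin-toned pixel as foreground:
# [[0, 1, 0], [0, 0, 0]]
```

A real untagged system works on the connected outline of that mask rather than raw pixels, but the principle is the same: no flashlight or marker, just the shape of the person in the frame.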
Weathermen wave at green screens
Around August of 2003, the news director at WKMG was looking for a more natural way for the on-air weather personality to control the FasTrac Millennium system during fast-moving storms and breaking weather stories. Cohen’s gesture recognition algorithms were a perfect fit.
Until recently, weather newscasts have always been scripted, much like a PowerPoint presentation. The weather personality had to memorize the script and coordinate his commentary with what appeared on his monitor as the show progressed. But live, on-air tornado and severe weather coverage doesn’t work well with a scripted format.
Frank Torbert, a 30-year broadcasting veteran and WKMG’s director of operations, contacted Cybernet after seeing some of its gesture products on the Web. Cybernet product manager Glenn Beach got involved, and within a few weeks had a working prototype in the studio, ready for testing.
“They brought the system in, and within a half hour, it was up and running,” Torbert told me when I visited him recently. The story goes that one of the weathermen gestured in front of the system for a few minutes and was so impressed that he used it during the next three weather newscasts, live and on-air. Talk about time to production!
Several factors contributed to the easy integration of the new GestureStorm system with the Millennium radar system:
- The studio uses a green screen behind the weather newscaster
- The weatherman’s hands are usually away from his body when he’s on the air
- Gesture initiation can be controlled by a remote garage door clicker-type device
- Studio lighting is always perfect
- Studio cameras provide high-quality and high-resolution video feeds
The green screen simplifies filtering out the video “noise” and makes it easy to outline the human form. Having the hands away from the body makes their outline crisp for easy recognition. To cut down on false positives (and the risk of sending the wrong commands), the weatherman uses the clicker to tell the GestureStorm software when to pick up a gesture. Studio lighting? Well, it makes everything look perfect. Finally, even though the system could use a USB camera, pulling in the studio camera feed is just plain obvious.
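The clicker trick is simple enough to sketch in a few lines of Python: gestures are only forwarded while the presenter has recently “armed” the system with the remote. The class name, method names, and the arming window are my assumptions for illustration, not anything from Cybernet's software.

```python
import time

# Hypothetical sketch of clicker-gated recognition: the remote opens a
# short time window, and recognized gestures are forwarded only while
# that window is open. Everything else is dropped as a potential false
# positive. The 5-second window is an invented value.

class GestureGate:
    def __init__(self, window_s=5.0, clock=time.monotonic):
        self.window_s = window_s   # how long one click keeps the gate open
        self.clock = clock         # injectable clock, handy for testing
        self.armed_at = None

    def click(self):
        """Remote clicker pressed: open the gate."""
        self.armed_at = self.clock()

    def accept(self, gesture):
        """Return the gesture while the gate is open, else None."""
        if self.armed_at is not None and self.clock() - self.armed_at <= self.window_s:
            return gesture
        return None
```

With a gate like this in front of the recognizer, a stray arm wave between segments never reaches the radar controller.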
Torbert showed me around the GestureStorm system on my visit to his studio. The machine is a standard AMD desktop box running Red Hat and GNOME. Video comes into the capture card (typically a Matrox Meteor) from the green-screen-facing studio camera. A little box with the Cybernet name on it functions as the remote clicker receiver. I think the clicker was an actual garage door remote. The whole thing fit right in with the dozen or so other PCs and monitors that filled the control room.
To show how the system worked, Torbert stood in front of the green screen and waved his arm. There on the GestureStorm monitor window was his outline, in black. Anytime he stayed stationary for more than half a second, a little box would roughly outline his hand. Pretty slick.
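That “stationary for more than half a second” behavior is a classic dwell test, and it can be sketched in a few lines of Python. The frame rate, tolerance, and function name below are illustrative assumptions, not details of the GestureStorm code.

```python
# Sketch of dwell detection: track the hand's estimated (x, y) position
# per video frame and report a dwell once it has moved less than a small
# tolerance for half a second straight. All numeric values are invented.

def detect_dwell(positions, fps=30, hold_s=0.5, tol=5.0):
    """positions: list of (x, y) hand estimates, one per frame.
    Return the index of the first frame at which the hand has been
    still for hold_s seconds, or None if it never settles."""
    need = int(fps * hold_s)          # consecutive still frames required
    still = 0
    for i in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i]
        if abs(x1 - x0) <= tol and abs(y1 - y0) <= tol:
            still += 1
            if still >= need:
                return i
        else:
            still = 0                 # any big move resets the count
    return None
```

Once the dwell fires, the system can draw its little box around the hand and treat its position as a pointing target.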
During my visit, a group of schoolchildren, possibly first graders, was also touring Studio A. At one point, a couple of them stood in front of the green screen. Just as before, the system picked out a stationary hand. Of course with more than one child, the little box jumped back and forth a little as it recognized hand shapes.
Staying up front
Baron Systems, which sells the Millennium and Vipir radar systems, worked closely with Cybernet on the interfaces to its products. It also furnished hardware during Cybernet’s GestureStorm development period. Torbert said that WKMG would be interfacing the GestureStorm system with Vipir radar shortly. Apparently, getting the two to talk is a little more complicated than the Millennium application. The development will keep WKMG on the cutting edge of weather forecasting.
Cybernet’s Cohen said that his company has Linux and Windows versions of the GestureStorm software, currently in use in six systems. The company has no plans to release the software as open source, preferring instead to license the technology through a gesture software development kit (SDK).
Rob Reilly is a consultant, writer, and commentator who advises clients on business and technology issues. His articles on Linux, portable computing, and public speaking skills regularly appear in Linux and business media outlets.