In August 2005, engineers in the phytoplankton ecology program at Mote were putting the finishing touches on a new red tide detector: an optical phytoplankton discriminator, affectionately nicknamed the BreveBuster.
The BreveBuster works by shining deuterium and tungsten light through a seawater sample and comparing the light absorption by particles in the water to the known absorption fingerprint of red tide. After performing some complex calculations, the DOS-based Persistor CPU transmits the results through an RS-232 port to an RF modem, cell phone modem, or Moxa serial-to-Ethernet converter.
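The comparison boils down to scoring how closely a sample's absorption spectrum tracks the known red tide fingerprint. Here's a rough sketch in Python -- the wavelength values and the use of a simple correlation score are my illustrative assumptions, not the instrument's actual algorithm:

```python
# Illustrative sketch of a fingerprint comparison: score a measured
# absorption spectrum against a reference fingerprint with a Pearson
# correlation. The spectra below are made-up numbers, not real data.

def similarity_index(measured, reference):
    """Pearson correlation between a measured absorption spectrum
    and a reference (e.g. Karenia brevis) fingerprint."""
    n = len(measured)
    mean_m = sum(measured) / n
    mean_r = sum(reference) / n
    cov = sum((m - mean_m) * (r - mean_r) for m, r in zip(measured, reference))
    var_m = sum((m - mean_m) ** 2 for m in measured)
    var_r = sum((r - mean_r) ** 2 for r in reference)
    return cov / (var_m * var_r) ** 0.5

reference = [0.10, 0.35, 0.80, 0.55, 0.20]   # made-up fingerprint
measured  = [0.12, 0.33, 0.78, 0.57, 0.22]   # made-up sample
print(round(similarity_index(measured, reference), 3))
```

A score near 1.0 would indicate the sample's spectrum closely matches the red tide fingerprint; in practice the instrument's onboard calculations are considerably more involved.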
On the other end of the transmission were a desktop PC, an ancient modem, some Visual Basic code, and a text log file -- not a bad research and development setup, but not the ideal environment for collecting and disseminating data from a planned Gulf of Mexico network of BreveBusters. It was time to give a world-class marine research group a world-class IT infrastructure.
When I joined the Mote staff as a data technology specialist, topping my list of tasks was the construction of a secure, environmentally conditioned server room -- no more servers sitting on staff members' desktops. Working with the lab's facilities staff, we were able to convert an empty classroom into the SO COOL room: The Sarasota Operations of the Coastal Oceans Observation Laboratories. Combining a robust server closet with an operational headquarters for oceanographic missions coordinated by Mote, the SO COOL room was designed to showcase the data collected by the BreveBusters. But before we could showcase, we needed some hardware.
For serving up databases and providing home directories for our staff we chose a Dell PowerEdge 2850 with 1GB of memory, twin 100GB RAID-1 drives for the operating system, and 300GB RAID-1 drives for applications and user space. We installed Web services on a Dell PowerEdge 1850 with 512MB of memory and two RAID-1 133GB drives. Backups and communication tasks were delegated to twin Dell PowerEdge 850s, each with 512MB of memory and 71GB disks. A Dell PowerVault 122T tape robot and four American Power Conversion UPSes completed the package.
We initially settled on Red Hat Enterprise Linux for our operating system and purchased a one-year subscription. Little did we realize what impact that decision would have on Mote's network.
Shortly after installing RHEL-4 on our new servers, I initiated software updates through the Red Hat Network. It didn't take long for the lab's central IT staff to come knocking on my door asking why I was hogging the entire Internet connection. Having recently joined the research staff, I was unaware that Mote's Internet needs were served by a 1.5Mbps T-1 link. The updates I had just kicked off were consuming all of the available bandwidth. We needed to make some changes.
After chatting with the IT director I discovered that Mote had a 45Mbps link to Internet2 scheduled to be activated in a few days. Could the speedy connection provide some relief? Yes, if we were willing to switch operating systems to CentOS, an enterprise-class Linux distribution based on Red Hat. Unlike RHEL, which requires commodity Internet access to reach the Red Hat Network servers, CentOS mirrors are plentiful on Internet2. By switching to CentOS we would be able to utilize the less congested I2 connection and not inconvenience the lab's users. We quickly rebuilt the servers with CentOS, and Mote's networking staff breathed a sigh of relief.
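Pointing yum at an Internet2-hosted mirror is just a repository file change. The fragment below is a hypothetical example -- the mirror hostname is a placeholder, and which mirrors are actually reachable depends on your I2 peering:

```ini
# /etc/yum.repos.d/CentOS-I2.repo -- hypothetical Internet2 mirror;
# replace baseurl with a CentOS mirror reachable over your I2 link.
[base-i2]
name=CentOS-$releasever - Base (Internet2 mirror)
baseurl=http://i2-mirror.example.edu/centos/$releasever/os/$basearch/
gpgcheck=1
enabled=1
```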
With the hardware and operating systems in place, it was time to turn our attention to the applications that collected and displayed data from the BreveBusters. The original installation utilized a Windows XP desktop, an old US Robotics modem, and a text file for logging. A Visual Basic application fired off every few hours and built a graph of collected data. This configuration was fine for testing but didn't lend itself to sharing and archiving the red tide data.
BreveBusters can be deployed just about anywhere -- buoys, channel markers, piers, and autonomous underwater vehicles -- so our communications code had to handle a wide range of inputs. But versatility shouldn't require duplication, so I didn't want to write a separate application for each type of deployment.
If a broadband connection is available, we use a Moxa serial-to-Ethernet converter to transport the data over IP back to our server. For remote sites, an RF modem, cellular modem, or Iridium satellite transmitter handles the communications chores. Utilizing the Device::SerialPort and IO::Socket libraries for Perl, I was able to build an application that listens for incoming BreveBuster communications on both tty ports and network sockets.
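Here's a stripped-down Python sketch of the network half of that listener (the production code is Perl, using Device::SerialPort and IO::Socket); the ephemeral-port handling and report format are illustrative:

```python
# Sketch of a listener that accepts TCP connections (e.g. from a Moxa
# serial-to-Ethernet converter) and hands each newline-terminated
# report line to a callback. A serial port could feed the same
# callback (pyserial in Python; Device::SerialPort in the Perl original).
import socket

def open_listener(host="127.0.0.1", port=0):
    """Bind a TCP listener; port 0 picks an ephemeral port (handy for
    testing -- a real deployment would use a fixed, published port)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    return srv

def accept_reports(srv, handle):
    """Accept one connection and pass each report line to handle()."""
    conn, _ = srv.accept()
    with conn, conn.makefile("r") as lines:
        for line in lines:
            handle(line.rstrip("\n"))
    srv.close()
```

A production version would loop over connections and multiplex several listeners (and serial ports) with select/poll rather than serving one connection at a time.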
The BreveBuster sends a string of 14 parameters during each transmission, and typically reports once every hour. Extracting data from a text file for one BreveBuster wasn't difficult, but with 300 planned deployments in the Gulf of Mexico, it was obvious that we required a more flexible solution. A nine-table MySQL database on the PowerEdge 2850 provided just what we needed.
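The core idea -- split the comma-delimited report into named fields and store them with a parameterized INSERT -- can be sketched as follows. The field names and single-table schema are illustrative stand-ins (the real system uses 14 parameters across nine MySQL tables), and SQLite stands in here for MySQL:

```python
# Hedged sketch: parse a BreveBuster-style report and store it in a
# relational database. Field names and the report format are
# illustrative assumptions; sqlite3 stands in for production MySQL.
import sqlite3

FIELDS = ["unit_id", "timestamp", "similarity_index"]  # 3 of the 14 parameters

def parse_report(line):
    """Split a comma-delimited transmission into named fields."""
    return dict(zip(FIELDS, line.strip().split(",")))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reports (unit_id TEXT, timestamp TEXT, "
           "similarity_index REAL)")
report = parse_report("BB07,2006-08-14T13:00:00,0.93")
db.execute("INSERT INTO reports VALUES (:unit_id, :timestamp, "
           ":similarity_index)", report)
rows = db.execute("SELECT unit_id, similarity_index FROM reports").fetchall()
print(rows)
```

Named placeholders keep the insert safe from malformed (or malicious) report strings, and moving from a flat log file to tables makes per-unit queries and historical trend graphs straightforward.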
Having the data housed in a relational database made building a dynamic Web site for control and visualization simple. A mixture of PHP, HTML, Cascading Style Sheets, and jpGraph provides staff and researchers with the ability to instantly see status changes in deployed BreveBusters, research historical trends, and graph selected parameters -- a significant improvement over that XP desktop and text log file.
Twelve months after going live with the new site, we've logged more than 9,000 BreveBuster reports, served up 500,000 Web pages to 72,000 sites, and haven't had a single system failure. We're partnering with 10 other research institutions in the Gulf of Mexico Coastal Oceans Observing System, and will be providing BreveBuster data to Ocean.us through a SOAP-based service. While we haven't yet discovered a remedy for red tide, Linux and the open source community provided a potent prescription for curing our data distribution dilemma.
Bob Currier is a data technology specialist at Mote Marine Laboratory in Sarasota, Florida. He recently retired from a 20-year career at Duke University, where he served as director of data and telecommunications. His credits include features, reviews, and opinion pieces in Network World, ITWorld.com, Smart Computing, and PC Computing.