Author: Todd Wells
First on my list is the ubiquitous OpenSSH. Combining OpenSSH with key authentication and Keychain to manage passphrases allows me to enter a passphrase once and then run repeated SSH commands or scripts on multiple servers without prompts. I use this ability to manually perform tasks involving a few servers. But some tasks, like reconfiguring the Network Time Protocol daemon, require me to make a change on most of my servers. To accomplish this, I create a command to perform the desired reconfiguration and test it on one or two servers. If it works correctly, I run a script that connects to each server and executes the command. The result is 40 servers reconfigured in a few minutes.
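A minimal sketch of that run-everywhere loop, assuming key authentication is already in place; the `servers.txt` host list and the helper name are hypothetical, and `SSH` can be overridden (e.g. `SSH=echo`) to dry-run the loop before touching real machines:

```shell
# Run one tested command on every host listed on stdin, one per line.
# SSH=echo turns this into a dry run that just prints what would be sent.
SSH=${SSH:-ssh}
run_on_all() {
    cmd=$1
    while read -r host; do
        # </dev/null keeps ssh from swallowing the host list on stdin
        $SSH "$host" "$cmd" </dev/null
    done
}

# Example (hypothetical host file and command):
# run_on_all 'sudo service ntpd restart' < servers.txt
```

Testing the command on one or two servers first, as described above, is the important part; the loop itself is the easy bit.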
I rely on Yum for installing RPM updates. Since we use multiple Linux distributions, I chose Yum over vendor-specific tools so I could have a single update method. Each Linux box is configured to point to an internal Yum repository, created using scripts I found Googling “distribution-name create yum repository.” After I receive and test an update, I add it to the Yum repository. A nightly cron job on each server installs the update.
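The nightly job can be as simple as a single cron entry; the schedule, log path, and file name below are illustrative, not prescriptive:

```
# Hypothetical /etc/cron.d/yum-nightly: pull anything newly added to the
# internal repository at 3:30 AM and log the result
30 3 * * * root /usr/bin/yum -y update >> /var/log/yum-nightly.log 2>&1
```

Because every box points at the same internal repository, adding a tested package there is all it takes to roll it out fleet-wide overnight.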
Rsync is an excellent tool for moving files. I use rsync to synchronize directories between my production, backup, and disaster site servers. I also find many occasions for specialized file transfer where rsync really shines over FTP or SCP. For instance, we occasionally need to copy large multimedia files to multiple remote locations through a small pipe. FTPing a 300MB file over a 64Kbps frame relay connection would virtually guarantee user complaints and unpleasant conversations with network support. Instead, I rsync the file with the bandwidth limit option set to 3KB/s (rsync's --bwlimit is measured in kilobytes per second). This keeps bandwidth available for the remote users and gets the file to its destination in two days.
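A sketch of that throttled transfer, wrapped in a hypothetical helper; the host and paths are placeholders, and `RSYNC` can be overridden to preview the command line:

```shell
# Throttled copy: --bwlimit=3 caps the transfer at 3 KBytes/sec
# (about 24Kbps of a 64Kbps link), leaving headroom for remote users.
RSYNC=${RSYNC:-rsync}
throttled_copy() {
    # --partial keeps partially transferred files, so a dropped
    # frame relay link resumes instead of starting over
    $RSYNC --partial --bwlimit=3 "$1" "$2"
}

# Example (hypothetical file and destination):
# throttled_copy promo-video.mpg remotesite:/var/media/
```

At 3KB/s a 300MB file takes a little under 28 hours of transfer time, which with interruptions and resumes lands in the two-day range mentioned above.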
I use another file mover, lftp, to receive data from and send data to vendors across the Internet. Recent security rules for personal information have caused companies to scramble to meet compliance deadlines. Not surprisingly, our vendors chose several different security protocols: SFTP, HTTPS, FTPS, and PGP over FTP. Lftp handles all of these protocols well and is easily scriptable. For vendors who require PGP over FTP, I use gnupg. To receive files from vendors, I use the vsftpd FTP server or OpenSSH server with rssh for added security.
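One way to script such a transfer is an lftp batch file; the vendor hostname, account, and paths below are hypothetical:

```shell
# Write a hypothetical lftp batch script that mirrors a vendor's
# SFTP drop directory, fetching only files newer than our local copies.
cat > get_vendor.lftp <<'EOF'
open sftp://acct@vendor.example.com
mirror --only-newer /outbound /data/inbound/vendor
bye
EOF

# Run it non-interactively, e.g. from cron:
# lftp -f get_vendor.lftp

# For PGP-over-FTP vendors, the payload would be encrypted first with
# gnupg before upload, along the lines of (hypothetical key name):
# gpg --encrypt --recipient vendor-key report.csv
```

The same batch-file approach works for the FTPS and HTTPS vendors; only the URL scheme on the `open` line changes.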
Our Linux deployment includes recycled Windows workstations. I personally use KDE, but for older systems with processor and memory limitations Xfce is a better desktop environment. It is lightweight and gives a simple, easily customizable desktop.
I use the same management processes for these workstations as I do for my servers, but GUI users on the systems require an application for remote desktop support. While I like TightVNC for remote desktop connections to servers, I use Vino to support desktop users. Vino is part of GNOME, but can be used with other desktop environments. It works well in low-bandwidth situations, which is important if the user is at a remote site connected by a 64Kbps frame relay connection.
If a Linux system is not performing properly on the network, I utilize several applications to locate the problem. I start with nmap to verify the server is on the network and to see which ports are open and closed. If this quick check does not show me where the problem lies, ntop and tcpdump provide a more detailed analysis of network traffic. These tools are particularly useful when troubleshooting servers in a DMZ, where legitimate network traffic can be blocked by an improperly configured firewall.
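That first quick check might look like the helper below; the target host and port list are illustrative, and `NMAP` can be overridden to preview the scan before running it against a production box:

```shell
# First-pass triage: is the host up, and are the ports we expect open?
# A TCP connect scan (-sT) works without root privileges.
NMAP=${NMAP:-nmap}
port_check() {
    $NMAP -sT -p 22,80,443 "$1"
}

# Example (hypothetical host):
# port_check problem-server
#
# If the quick scan looks clean, drop down to the packet level, e.g.:
# tcpdump -n -i eth0 host problem-server
```

Comparing the scan results from inside and outside the DMZ quickly shows whether a firewall rule, rather than the server itself, is eating the traffic.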
Finally, Nvu makes writing and updating my HTML documentation an easy task. I add screen prints from KSnapshot, part of the KDE package kdegraphics, to make the documentation crystal clear, saving me many late-night calls from support staff.