The CIS Linux benchmark: Security best practices for Red Hat and Fedora Core


Author: Bruce Byfield

The Center for Internet Security
(CIS) is a non-profit association for the promotion of computer security. Its
members, largely North American, range from IBM and Motorola to universities
and individuals. Through the consensus of members, it develops a list of
best practices for Windows, Linux, Solaris, and FreeBSD, as well as Cisco routers, Oracle databases, and Apache Web servers. These best practices are incorporated into benchmark scripts and accompanying PDF guides for interpreting the results and improving security with a series of actions and scripts. The CIS Linux Benchmark provides a comprehensive checklist for system hardening.

Because the CIS has limited resources, its current Linux Benchmark is designed only for Red Hat Enterprise Linux 2.1 and 3.0, and Fedora Core 1, 2, and 3. Although CIS suggests that derivatives of these distributions may also be able to run the Benchmark, for now its usefulness is limited. However, even if the Benchmark itself won’t run with a particular distribution, the information in the accompanying PDF file can be adapted to most distributions with a minimum of effort and expertise.

If you are lucky enough to be able to use the Benchmark directly, it provides an objective standard for talking about Linux security. The results may shake up your ideas about how secure your Linux box really is.

Using the Benchmark

The best time to run the Linux Benchmark is immediately after installation, so you can be reasonably sure that your system is secure from the start. The Linux Benchmark is available as a free download in an .rpm package. Once the package is installed, running the command cis-scan provides an unobtrusive test of the current system. The Benchmark gives an immediate raw score on a ten-point scale. The Benchmark also writes a date-named log that breaks down the raw score into a detailed series of positive and negative assessments.
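A first run looks something like the sketch below. The package filename and the log’s name and location vary by release, so treat the paths here as illustrative rather than exact:

    # Install the Benchmark package (filename is illustrative;
    # use the one you downloaded from the CIS site)
    rpm -ivh cis-linux-benchmark.i386.rpm

    # Run the scan; the raw score out of 10 prints when it finishes
    cis-scan

    # Read the date-named log the scan writes (name and location vary;
    # substitute the file that cis-scan reports)
    less cis-ruler-log.20050301.txt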

These results provide an objective frame of reference, but not an absolute one. That is to say, if a system scored a perfect 10.00, the results would detail its configuration in a way that anyone could confirm, and you could safely say that your system followed all of CIS’s best practices. However, you could not say that the system was immune from attack. Web and mail servers, and probably other programs, could open security holes not covered by the Benchmark. And the CIS Benchmark can do nothing to guard against sloppy practices such as using the root account for everyday computing.

Nor can the CIS Benchmarks be used to make a meaningful comparison between Linux and Windows installations. The two Benchmarks measure different vulnerabilities, so the results are not comparable across different platforms.

Once you run the Benchmark, open the log and the PDF file. Items are listed in the log in the same order as in the PDF file. To harden the current system, scan the log for negative items, then turn to the corresponding sections in the file to correct them.
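If your log flags failed checks with a Negative prefix (an assumption worth verifying against a few lines of your own log), a quick grep pulls out everything that needs attention. The log filename here is again illustrative:

    # List only the failed checks, in the order the PDF file uses
    grep '^Negative' cis-ruler-log.20050301.txt

    # Tally passes and failures for a rough sense of progress
    grep -c '^Positive' cis-ruler-log.20050301.txt
    grep -c '^Negative' cis-ruler-log.20050301.txt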

Before beginning the process of hardening, you will also want to download the Bastille and sysstat packages so that they are available when you need them. Both are listed in many .rpm repositories. Bastille also requires the installation of either the Perl-Tk or Perl-Curses package.
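On Fedora Core, something like the following fetches everything at once. The package names are my guesses and vary between repositories (Bastille’s, in particular, is sometimes capitalized):

    # Install Bastille, sysstat, and one of Bastille's interface
    # back ends; adjust the names to whatever your repository uses
    yum install Bastille sysstat perl-Tk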

Bastille is especially important to the Benchmark. It runs an interactive tutorial for securing a system, explaining options and why you might choose them. Some duplication exists between Bastille and the CIS Benchmark, such as the use of warning banners for intruders, but the two are separate enough that not running Bastille can lower a raw score by 0.60 out of 10, almost twice as much as any other single item.
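On the Bastille versions current as I write, the tutorial starts from the command line, with a flag selecting the interface; older releases may use different invocations:

    # Run the interactive tutorial with the graphical (Perl-Tk) interface
    bastille -x

    # Or use the curses interface on a machine without X
    bastille -c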

The Benchmark PDF file is divided into 10 sections, covering dozens of topics ranging from how recently the system was patched, to minimizing the xinetd and boot services, to the setup of file permissions, system authentication, user accounts, and environments. Installing and running Bastille is emphasized, but CIS also provides its own additional scripts and advice. The file ends with a listing of anti-virus software (making clear that such software is needed only on servers that interact with Windows), as well as a list of minor security steps that CIS members believe have a minimal effect on overall security.
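On Red Hat and Fedora, the boot and xinetd services that the file asks you to minimize are managed with chkconfig, so much of that section reduces to invocations like these (the service named is an example of mine, not an item from CIS’s list):

    # Review which services start in each runlevel, plus the
    # on/off state of xinetd-managed services
    chkconfig --list

    # Turn off a service you decide you can live without,
    # and stop the running instance
    chkconfig kudzu off
    service kudzu stop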

Mindful that some users may be learning security basics from the file, CIS stresses at the start the importance of backing up key files and data, and, like Bastille, provides brief guidelines to help users decide whether they want to implement each step. Some might argue that the file has a few omissions — for example, it makes no mention of packages such as cracklib that can be installed to prevent users from choosing too simple a password. It also emphasizes configuration through scripts rather than through the manual editing of files that many Linux users prefer. The time needed to work through it depends on the Benchmark results and your existing knowledge of security, but give yourself at least two to three hours, including running Bastille.
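As for the cracklib omission, it is easy to remedy yourself. On Red Hat and Fedora, cracklib is wired in through PAM, and a single line along these lines in /etc/pam.d/system-auth enforces password quality; the parameters are my illustrations, not CIS recommendations:

    # Fragment of /etc/pam.d/system-auth: require at least 8
    # characters, including at least one digit and one uppercase
    # letter (example values)
    password  requisite  pam_cracklib.so retry=3 minlen=8 dcredit=-1 ucredit=-1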

When you are curious about your progress, or have finished hardening the system, run cis-scan again for a revised score and log.

Using the Benchmark to improve security

To get a sense of how secure a Linux system is immediately after installation, I did several installations of Fedora Core 3 using the three default installation types: Personal, Workstation, and Server. In each case, I used the default partitioning and package selection. I enabled the firewall and set SELinux to Active. For Personal and Workstation installs, I included a graphical login, CUPS, Kudzu, hotplug, and the compilers that are installed with the Development Tools selection — in short, the sort of tools that the average desktop user is likely to want. For Server installations, I included CUPS and enabled remote login via SSH as well as Web server and mail server options. For half the server installations, I installed a graphical desktop.

The Benchmark results struck a severe blow to my smugness as a Linux user. The highest initial score was for a server without a graphical desktop, and that was only 5.89. The lowest, surprisingly, was not for the Personal installations, but a 5.48 for a Workstation install — presumably the choice for a computer on a network that needs more security than a personal computer. The Personal installation came in at 5.62.

These scores hardly indicate wide-open systems. All the installations got positive scores for not having any remote services enabled by default. Moreover, the Personal and Workstation installs both had remote logins disabled. Nor were there any world-writable files in the home directories. All the installations, however, lost points for having no SSH client configuration, for allowing ordinary users to mount and remount drives automatically, and for setting no reasonable expiry date for passwords. Still, the scores were far from the high security that I expected.
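Of those items, the password expiry date is the easiest to win back. A sketch for a single account, with a username and numbers of my own choosing rather than CIS’s:

    # Make the password for user 'bruce' expire every 90 days,
    # with a warning 7 days beforehand (example values)
    chage -M 90 -W 7 bruce

    # Confirm the new aging policy for the account
    chage -l bruce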

Working through the PDF file, I boosted the scores to 7.36 for a Workstation install, 7.50 for a Personal install, and 7.69 for a Server install with no graphical interface. The scores for the Workstation and Personal installs included CUPS, Kudzu, and compilers, but not automatic mounting of removable drives — a sacrifice that seemed reasonable for the sake of security, considering that automount has only been standard on Linux installations for a couple of years, and I was used to doing without it.

Initially, the Server install also included CUPS. However, I removed CUPS, reran Bastille and the CIS Benchmark, and, through a series of trials and errors, made a battery of changes that included prohibiting direct login to the root account, enabling process accounting, and setting the machine to run only as a mail server. In the end, I succeeded in boosting the score to 9.17. On a Personal or Workstation installation, this final score would have come at a high cost in convenience, but it left a functional dedicated server.
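Prohibiting direct root login over SSH, one item in that battery, shows how small the individual changes are. The directive is standard OpenSSH, and the restart command is the Red Hat idiom:

    # Add to /etc/ssh/sshd_config:
    #   PermitRootLogin no
    # which forces administrators to log in as themselves and then
    # use su; restart the daemon so the change takes effect:
    service sshd restart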

At the end of the process, I had improved the security ratings on the Workstation and Personal installations by more than 40 percent, and on the Server installation by more than 65 percent. Since the Benchmark appears to weight its criteria by importance, these percentage gains seem meaningful. At the cost of a couple of hours’ work, I had significantly increased the security of all the installations.

Other uses of the Benchmark

Besides improving security on individual systems, the Benchmark also makes talking about the effect of system changes easier. By running cis-scan after each configuration change, you can get an exact measurement of how the changes affect system security. For example, a Fedora Core 3 server installation without a graphical desktop receives a raw score that is 0.35 higher than an otherwise identical installation with one. Similarly, removing all setuid and setgid settings increases a score by up to 0.28 on Fedora Core 3. For single items, these are large differences. Personally, I doubt that such items can be quantified so exactly, but the figures reflect how much these increasingly standard configurations affect security.
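For the curious, locating the setuid and setgid files that such a change strips is a one-liner; deciding which of them to strip is the judgment call. The commands below are standard find and chmod usage, not the Benchmark’s own script:

    # List setuid and setgid files on the root filesystem
    # (-xdev stops find from crossing into other mounted filesystems)
    find / -xdev -type f \( -perm -4000 -o -perm -2000 \) -print

    # Strip the setuid bit from a binary you decide ordinary users
    # can do without (example target only)
    chmod u-s /usr/bin/chfn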

Similarly, if you have ever suspected that Linux security is being sacrificed to appeal to refugee Windows users, the CIS Benchmark provides a yardstick for measuring the differences between releases. Security consultant Dan Razzell of Starfish Systems provided me with the Benchmark results for a Red Hat 7.2 server installation with a graphical desktop, tested in 2002. It received a rating of 6.09. A similar Fedora Core 3 server installation today scored 5.62. A look at the Benchmark log would provide detailed information about exactly how security has been relaxed in the last three years. Even allowing for minor differences in the two systems, the security of Red Hat and Fedora really does seem to be declining. Instead of voicing a vague suspicion, you can use the CIS Benchmark to provide concrete proof of such changes.

An exception for the sake of education

Besides the limited coverage of Linux distributions, the largest drawback to the CIS Benchmark is its license. Although the Benchmark is available as a free download,
the license for non-CIS members is not free. Specifically, the license limits installation to a single computer, and includes an agreement not to reverse-engineer the code.

According to John Banghart, the director of benchmark services at CIS, these restrictions are intended to “create a value proposition for companies becoming members with our organization. These membership dollars help keep us in business. In addition, we enter into broad strategic partnerships that permit others to use the Benchmarks and tools more freely while paying CIS for that right.”

For some, these remarks may be enough to dismiss the CIS Linux Benchmark. Yet, given the CIS’s non-profit status and the usefulness of its Benchmark, perhaps this is one case where the usual philosophical objections need not apply. Running the Benchmark is not only far cheaper than hiring a consultant, but educational as well.

For example, at first I was surprised to see that installations were also docked points for having no cron.allow or at.allow files for the two system schedulers. However, after some thought, I realized that this scoring followed a basic security principle: rather than leaving a service wide open and then blocking individual users, it is easier, and therefore sounder, to maintain a list of those who are allowed to use it. Having seen the application of the security principle once, I now know how to apply it in other cases.
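The fix itself is two small files. A minimal sketch, assuming root is the only account that needs the schedulers (extend the lists as your users require):

    # Only the users listed in these files may use cron and at;
    # everyone else is refused
    echo root > /etc/cron.allow
    echo root > /etc/at.allow

    # Keep the lists owned by and readable only by root
    chown root:root /etc/cron.allow /etc/at.allow
    chmod 600 /etc/cron.allow /etc/at.allow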

Besides a deeper knowledge of Linux security, I came away from the CIS Benchmark with
three conclusions:

  • The names for default choices in installation programs have little to do with security. They seem to refer only to the packages installed.
  • You can significantly improve security on Linux systems without interfering with normal use. Newer users might object to not being able to mount and unmount removable media automatically, yet the fact that thousands of longtime Linux users survive without this convenience suggests that it is a minimal inconvenience once you are used to it.
  • The fact that a CD image contains “Linux” in its name is no guarantee of the best possible security. While the installs I did weren’t sending out invitations to crackers, they were still far less secure than they had to be. A number of settings, such as those for password duration and length, could be improved with next to no negative effect on users. At most, they would require occasional effort, such as changing a personal password every three months; the fragment after this list shows where such defaults live.
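On Red Hat and Fedora, the password duration and length defaults mentioned in the last point live in /etc/login.defs. A fragment like this, with figures that are my examples rather than CIS’s, covers newly created accounts:

    # Fragment of /etc/login.defs (example values)
    # Force a password change every three months
    PASS_MAX_DAYS   90
    # Warn users a week before expiry
    PASS_WARN_AGE   7
    # Require at least eight characters
    PASS_MIN_LEN    8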

I’d suspected all these conclusions before. However, using the CIS Benchmark gave me my first concrete confirmations of them. That certainty alone makes the time I spent using the CIS Benchmark time well-spent.
