August 22, 2005

CLI Magic: Make time for crontab

Author: JT Smith

Last week we looked at the at command, which will run a set of commands once and once only. For more complex regular scheduling, get to know your crontab file.

Why use a file to manage periodic jobs when a simple command can do about the same job? Well, suppose you create a script to submit at commands periodically. For example, if you have a file myscript that runs a series of commands and includes as its final command at now + 1 week < myscript, then the script will automatically resubmit itself once a week. Say the script carries out a backup every Sunday at 16:00. What if one weekend the server goes down, and doesn't come back up until Monday at 10:00? As soon as the server comes back up it will fire off all at jobs that have passed their activation times, so when our script resubmits itself it will do so at 10:00 each Monday -- not what we want. That is why we will now look at regular scheduling using crontab.

As with most Linux commands, there are any number of explanations for the name "crontab" -- the one I prefer is c(h)ronological table. To begin using it, type crontab -e to invoke a text editor on the file. By default crontab -e uses vi. To change the default editor to, say, pico, add this line to your .profile:
export VISUAL=pico

The key to crontab is to remember the table structure:

  • Minute
  • Hour
  • Day of the month
  • Month
  • Day of the week
  • Command
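Putting those fields together, a crontab entry is one line: five time fields, then the command. Here is a commented sketch (the echo commands are just illustrations):

```shell
# minute  hour  day-of-month  month  day-of-week  command
# 0-59    0-23  1-31          1-12   0-7 (0 or 7 = Sunday)
15 3 * * * echo "runs at 03:15 every day"
0 12 1 * * echo "runs at noon on the first of every month"
```

An asterisk in a field means "every value", so the first entry fires daily regardless of date, month, or weekday.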

So, for instance, the line
0 2 * * sun tar -cvf /mnt/dvd/backup$(date +"\%Y\%m\%d").tar ~/*
will create a backup of your home directory every Sunday at 02:00. Literally it means "carry out the command on the hour at 2 a.m. every Sunday." Two crontab quirks to note: day and month names are matched on their first three letters (hence sun, not Sunday), and a percent sign in the command field must be escaped as \%, or cron will turn it into a newline. Similarly,
0 0 1 jan * echo "Happy New Year" | mail -s Greeting author@markbain-writer.co.uk
sends an email every January 1 at midnight.

We can do a bit more with times. The following example emails a schedule file at 7:30 a.m. every weekday. Notice the use of a dash to denote a range, and the quoting around the subject so that its embedded spaces survive:
30 7 * * mon-fri echo "Your schedule" | mail -s "Your schedule: $(date +'\%d-\%b-\%Y')" -a schedule.txt author@markbain-writer.co.uk
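The subject line above embeds the current date via date(1); you can preview what that format string produces at the shell prompt. Here -d pins the date to a fixed day (a GNU date option) and LC_ALL=C pins the month abbreviation to English, so the result is predictable:

```shell
# Preview the %d-%b-%Y format used in the mail subject above.
LC_ALL=C date -d 2005-08-22 +"%d-%b-%Y"   # prints 22-Aug-2005
```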

As well as ranges you can use time divisions (step values). To get crontab to run a task every 10 minutes you could type 0,10,20,30,40,50 in the minutes field, or you can use the much neater form */10. You can mix the forms, but a range must run from low to high -- cron does not wrap past midnight, so 18-8/2 is invalid and the overnight hours have to be split in two. To run a process hourly during the day and every two hours during the night you would type:
0 9-17,18-23/2,0-8/2 * * * myprocess
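As a sanity check, a short shell loop can list the hours at which a day/night schedule like this would fire -- hourly from 9 to 17, every two hours overnight (myprocess is the placeholder name from the text):

```shell
#!/bin/sh
# Expand "hourly 9-17, every two hours overnight" by hand to see
# which hours myprocess would run at.
hours=""
for h in $(seq 9 17);    do hours="$hours $h"; done  # daytime, hourly
for h in $(seq 18 2 23); do hours="$hours $h"; done  # evening, every 2h
for h in $(seq 0 2 8);   do hours="$hours $h"; done  # early morning, every 2h
echo "myprocess would run at hours:$hours"
```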

As with at, you can limit who can use crontab, but in a slightly different way. If /etc/cron.allow exists then only the users listed there can use crontab. If it doesn't exist but /etc/cron.deny does, then everyone except the users listed there may use crontab. If neither file exists then everyone may set schedules.
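Those three rules can be restated as a small shell function -- my own sketch of the logic, not cron's actual source. The file paths are taken as arguments (the real files are /etc/cron.allow and /etc/cron.deny) so you can try it safely:

```shell
#!/bin/sh
# Sketch of the cron.allow/cron.deny check described above.
may_use_crontab() {
  user="$1" allow="$2" deny="$3"
  if [ -f "$allow" ]; then
    grep -qx "$user" "$allow"       # allow file exists: user must be listed
  elif [ -f "$deny" ]; then
    ! grep -qx "$user" "$deny"      # only deny file: user must not be listed
  else
    true                            # neither file: everyone may schedule
  fi
}
```

For example, with only a deny file containing carol, the function fails for carol and succeeds for everyone else.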

Viewing crontab jobs

You can view jobs in two ways: either use the crontab -e command, or, if you don't want to go into the editor, use crontab -l. Both show the details you have typed into the crontab file, making it easy to see what is due to happen and when. As with at, you can see only your own jobs unless you are root, who can inspect any user's file with crontab -l -u username.

Here's a tip: Don't remove a listing from crontab unless you are really, really sure that you no longer need it. Instead, put a hash (#) at the start of the line to disable that particular entry.
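For example, a variant of the Sunday backup entry from earlier can be parked like this (the reason given in the comment is hypothetical) and re-enabled later simply by deleting the hash:

```shell
# Backup disabled while the DVD drive is being replaced -- do not delete!
#0 2 * * sun tar -cvf /mnt/dvd/backup$(date +"\%Y\%m\%d").tar ~/*
```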

Let's go back to the example we started with. If a server goes down and is then brought back up, crontab scheduling simply continues from that point. This means that if you had a job scheduled for 08:00 on a Monday but the server doesn't come online until 09:00, that slot is missed and your job won't run this time around. It will run the following Monday at 08:00 (assuming the server is up then, of course).

With at and crontab you can start automating regular tasks. No longer do you have to be up at 4:00 a.m. to do that backup; your database can be updated during the night; everyone's schedule can be emailed to them before anyone is even in the office. And all of this while you sit back and dream of how else you can use these simple and effective tools.
