Bash disaster prevention

From assela Pathirana


Like any other versatile tool, UNIX scripting can have disastrous consequences when misused. Careful consideration of possible pitfalls in scripting is a very necessary art. However, it is equally important to have some 'safety valve' in case something goes terribly wrong.

A horror story

You listen to music and have built up quite an extensive library of digital music on your computer under the directory /home/alex/music. Every day, you copy a folder from under the music folder (e.g. beatles/yellowsubmarine) to your MP3 player, which appears as a USB drive (say /mnt/usbstick). You write a small script called loadmusic.bash and save it in /home/alex/:

#!/bin/bash
cd /mnt/usbstick/todaysmusic # Go to the appropriate folder
rm -rf * # Delete all the old files
cp -r "/home/alex/music/$1" .
echo 'Done!'

and call it as

$ loadmusic.bash yellowsubmarine

One day, you simply forget to insert the USB key before running the command:

loadmusic.bash: line 2: cd: /mnt/usbstick/todaysmusic: No such file or directory

What happens here? Bash tries to change directory to /mnt/usbstick/todaysmusic, but can't. Does it stop there? Not unless we ask it to. It simply forgets what happened and executes the next command, namely rm -rf *, and innocently deletes ALL your digital music albums! (And we don't waste time on useless stuff like backups!!)

Error handling

Never, ever use dangerous commands like rm -rf (or rm by itself, for that matter) or mv in a script that doesn't have a proper error handler.

Whenever there is the potential for pitfalls like the above, use some form of general error handler to at least tell the script to exit without trying the rest of the commands.

function handle {
   echo "Error"
   exit 1
}
trap handle ERR
## End of error handling
cd /mnt/usbstick/todaysmusic # Go to the appropriate folder
rm -rf * # Delete all the old files
cp -r "/home/alex/music/$1" .
echo 'Done!'

which will execute the function handle() as soon as it encounters an error. And inside handle(), we have exit, so there's no danger of bash trying to be smart.

loadmusic.bash: line 7: cd: /mnt/usbstick/todaysmusic: No such file or directory
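Besides trap, bash's built-in set -e option (not used in the script above) gives similar protection in a single line: the shell exits as soon as any command fails. A safe way to try it from an interactive shell, using a made-up directory name so nothing real is touched:

```shell
output=$(
  set -e                              # exit the subshell on any failure
  cd /no/such/directory 2>/dev/null   # fails, so nothing below runs
  echo 'never reached'
)
echo "captured output: '$output'"     # prints: captured output: ''
```

Because the cd fails under set -e, the subshell exits immediately and the echo after it never runs, so nothing is captured.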

More useful error handling

The trap type of error handling is useful for preventing disasters. For more graceful error handling, the return value of each individual command can be checked.

Every well-behaved Unix command has an exit status expressed by an integer value between 0 and 255. 0 indicates success. Other codes are less universal. On many systems, 127 indicates a "Command not found" error, 126 "permission denied", and 130 "process interrupted" (e.g. Ctrl+C).
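The exit status of the last command is always available in the special variable $?. A quick sketch, using a made-up directory name to force a failure:

```shell
ls /no/such/directory 2>/dev/null   # this command fails
status=$?
echo "ls exited with status $status"   # non-zero on failure

true                                   # this command always succeeds
echo "true exited with status $?"      # prints 0
```

Note that $? is overwritten by every command, so save it into a variable right away if you need it later.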

Look at the following code snippet:

date=`date +%Y%m%d`
wget $site/$file -nv
if [ $? -ne 0 ]; then
  echo "Some error in retrieving $file."
  exit 1
fi
echo "Now processing $file"
./process.bash $file

It tries to download a file from an FTP site. Then the code checks the exit status of wget, which should be zero only if everything went all right. Otherwise, it will be non-zero, and the script prints a meaningful error message and exits.
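The same check can also be written without mentioning $? at all, by testing the command directly with if !. In this sketch, wget is replaced by a hypothetical fetch function built on false, simply to force a failure without needing a network:

```shell
fetch() {
  false   # stands in for: wget $site/$file -nv (forced failure here)
}

if ! fetch; then
  msg="Some error in retrieving the file."
else
  msg="Now processing the file"
fi
echo "$msg"   # prints the error branch, since fetch fails
```

Testing the command directly avoids the risk of some other command clobbering $? before you inspect it.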
