Bash script that cannot be run more than once at the same time

People dealing with cron and bash scripts that sometimes take longer than they are supposed to often encounter the following behaviour.

Suppose that you launch something from cron, say, every hour. The job usually completes in 10 minutes, but sometimes, when the load peaks or the network clogs, the process runs much slower. After an hour another instance is launched, further hogging the resources of the machine and possibly messing up data.
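For concreteness, this is the kind of crontab entry where the overlap happens (the script path is made up for illustration):

```shell
# Run the job at minute 0 of every hour.
# /usr/local/bin/myjob.sh is a hypothetical path.
0 * * * * /usr/local/bin/myjob.sh
```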

The solution is of course to make sure the script does not run twice at the same time (and also to fix the underlying problem that is causing the slowdowns).

Craig Andrews posted an almost working shell script on his blog.

I hereby shamelessly re-post the snippet. My only excuse is that I've fixed the typo that caused the output of kill to be written into a file named 1. I've also added a few comments and hints. Anyway, here it is:


#!/bin/bash

# Replace this with a meaningful and unique filename.
# You probably need root privileges to write to /var/run.
# You can use any other filename and path for testing or
# for whatever reason.
pidfile=/var/run/myscript.pid

if [ -e "$pidfile" ]; then
    pid=$(cat "$pidfile")
    # kill -0 sends no signal; it only checks whether the
    # process exists. Redirect stdout to /dev/null first,
    # then send stderr to the same place.
    if kill -0 "$pid" > /dev/null 2>&1; then
        echo "Already running"
        exit 1
    fi
    # The recorded process is gone, so the pidfile is stale.
    rm -f "$pidfile"
fi

echo $$ > "$pidfile"

# Do your stuff here.
# For testing purposes, we're just gonna
# sleep for 10 seconds. Try opening two
# terminal windows and launching the script
# in both at the same time.
sleep 10

rm "$pidfile"
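Note that the pidfile approach has a small race window between checking the file and writing it, and a crashed script leaves a stale pidfile behind. A sketch of an alternative using flock(1) from util-linux (assuming it is installed; the lock path is made up for illustration): the kernel releases the lock automatically when the script exits, even if it is killed, so there is nothing to clean up.

```shell
#!/bin/bash
# Hypothetical lock file path; any writable path works.
lockfile=/tmp/myjob.lock

exec 9> "$lockfile"        # open file descriptor 9 on the lock file
if ! flock -n 9; then      # try to take an exclusive lock, non-blocking
    echo "Already running"
    exit 1
fi

# Do your stuff here; the lock is held until the script exits.
sleep 10
```

Launching the script in two terminals at once makes the second copy print "Already running" and exit immediately, just like the pidfile version.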

