Make sure only one Python script is running at a time on Linux
If you have a resource-intensive process that could bring your server to its knees when more than one instance runs at a time, you'll need some programmatic way of ensuring that never happens. The most "unixy" way to do this is with a process id (pid) file, and I'll describe how to do this in Python below. An advantage of this method is that you don't have to rely on a separate watcher process to keep running and looking for work to do. I don't like relying on long-running processes, especially if they pull in C libraries with potential memory leaks and crashes.

Essentially, you launch the Python script like normal, either through an "os.system" call in your Python web script or from a cron job running at short intervals. The script looks for a file called "cron.pid". If the file exists, the script reads the process id from it, then checks the process information exposed by the Linux /proc filesystem to see whether that process id is running the same script. If it is, the script exits gracefully. Otherwise, it gets its own process id and writes it to "cron.pid". Here's some example code.
```python
# gen_cartos.py
import os
import sys

running_pid = None
if os.path.isfile("cron.pid"):
    # Read the pid of the (possibly still running) previous instance.
    with open("cron.pid", "r") as f:
        running_pid = int(f.read())
    # /proc/<pid> only exists while that process is alive.
    if os.path.exists("/proc/%s" % running_pid):
        with open("/proc/%s/cmdline" % running_pid) as f:
            if "gen_cartos" in f.read():
                # Another instance of this script is running; bail out.
                sys.exit(0)

# Record our own pid so the next launch can find us.
curr_pid = os.getpid()
with open("cron.pid", "w") as pid_file:
    pid_file.write("%s" % curr_pid)
```
A small chance of a race condition exists with this method: two instances of the script could be launched at exactly the same moment, before either is able to write a cron.pid file. The odds are slim in most use cases for this pattern, and hopefully having two instances running briefly won't wreak havoc on your system. When I use this pattern, I'm more concerned about the system repeatedly launching 10+ instances of a script that never finishes because the machine has run out of memory or other resources.
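If that race window bothers you, an alternative is to let the kernel do the locking with fcntl.flock, which is atomic: only one process can ever hold the exclusive lock, and the lock disappears automatically when the process exits, so stale pid files are a non-issue. This is my own sketch, not part of the pattern above; the acquire_lock name and cron.lock path are made up for illustration.

```python
import fcntl
import os
import sys

def acquire_lock(path="cron.lock"):
    """Take an exclusive, non-blocking lock on `path`, or exit.

    Returns the open file object on success; keep it open (and the
    lock held) for the lifetime of the script. The kernel releases
    the lock automatically if the process dies.
    """
    lock_file = open(path, "w")
    try:
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        # Another instance already holds the lock; exit gracefully.
        sys.exit(0)
    # Writing the pid is optional here, but handy for debugging.
    lock_file.write("%s\n" % os.getpid())
    lock_file.flush()
    return lock_file
```

Call acquire_lock() once at the top of the script and hold on to the returned file object; everything after that line is guaranteed to run in at most one instance at a time.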
Remote python debugging for free
Sometimes you need to debug a production Django install when something weird is happening that isn't happening on your development box. A simple way to do this is to use winpdb. First, decide where you want to break into your code and insert the following line:
import rpdb2; rpdb2.start_embedded_debugger("password")
The script will wait at that point for winpdb to connect. Next, open winpdb and select "Attach" from the File menu. Enter the same password and the host of the web server, and start debugging away. You may also need to pay attention to this note from the winpdb online docs:
Firewall Ports (For remote debugging)
When using Winpdb for remote debugging make sure any firewall on the way has TCP port 51000 open. Note that if port 51000 is taken Winpdb will search for an alternative port between 51000 and 51023.
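One caveat in production: start_embedded_debugger blocks the request until a debugger attaches, so you probably don't want it firing on every hit. A sketch of one way around this (my own idea, not from the winpdb docs) is to guard the call behind an environment variable; the maybe_start_debugger helper and the DEBUG_RPDB2 variable name are made up for illustration. If you're attaching from a different machine, also check whether your rpdb2 version needs its allow-remote flag enabled.

```python
import os

def maybe_start_debugger():
    """Block for a winpdb connection only when DEBUG_RPDB2=1 is set.

    Hypothetical helper: lets you leave the breakpoint call in place
    without stalling normal production requests.
    """
    if os.environ.get("DEBUG_RPDB2") != "1":
        return False
    # Imported lazily so ordinary requests never touch rpdb2.
    import rpdb2
    rpdb2.start_embedded_debugger("password")
    return True
```

Drop maybe_start_debugger() in where you would have called start_embedded_debugger directly, then set DEBUG_RPDB2=1 in the server's environment only while you're actively debugging.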