
I have a Python project which consists of 3 Python scripts that run indefinitely, and I would like to run them simultaneously. They are:

  • web.py: a Flask application that runs forever as the control panel of the project.
  • pir.py: a script that waits for the motion sensor to go HIGH, does some stuff, and then keeps looping until the sensor goes HIGH again.
  • keypadd.py: loops until the keypad goes HIGH, reads passcode combinations, does some stuff, and loops again.

I tried to run those three files with sh in a shell script. But once web.py runs, it seems like the other Python files are waiting in a queue until web.py finishes, after which they would be executed. But web.py will never finish, because it loops forever. I would like the scripts to be executed in this order: web.py, pir.py, and then keypadd.py.

How can I do that?

Shell script I used:

#!/bin/sh
# launcher.sh


cd /home/pi/Ptoject
sudo python web.py
sudo python pir.py
sudo python keypadd.py

3 Answers


Another option would be to use a Python wrapper script instead of a shell script: import the functions you need and run them in separate processes:

import time
import multiprocessing

import web
import pir
import keypadd

processes = []

# args is a placeholder -- pass whatever arguments your functions actually take
for func in [web.function, pir.function, keypadd.function]:
    processes.append(multiprocessing.Process(target=func, args=(arg1, arg2)))
    processes[-1].start()

# Do stuff

while True:
    time.sleep(600)  # sleep for 10 minutes
    living_processes = [p for p in processes if p.is_alive()]
    if len(living_processes) < 3:
        # at least one process died: kill the survivors
        for p in living_processes:
            p.terminate()

        print("Oops: Some processes died")
        # do other error handling here if necessary
        break

There are some nice things about this approach:

  • It is all in Python, which I think is nice when the rest of my project is Python.
  • You can also add some logic in the "do stuff" area that monitors your processes and kills the others if one fails with an exception, something similar to what I've done in the example.
  • Better yet, you could create a multiprocessing.Queue() and if one process fails, send a "poison pill" to the rest to have them exit cleanly instead of terminating them. You'll need to add some logic to the infinitely looping routines to make them check the queue every once in a while to see if they should exit, and join them nicely after you've dropped the pill.
  • Stopping everything with this approach is easier than stopping background processes. A single Ctrl-C in the main script will cause a KeyboardInterrupt to be propagated to the child processes, bringing everything to a halt at the same time.

1 Comment

That's a really nice idea, to be honest. I'll stick with the shell script for now, until I finish the rest of the project, and then come back to it. Thanks, mate.
#!/bin/sh
# launcher.sh
cd /home/pi/Ptoject
sudo python web.py & sudo python pir.py & sudo python keypadd.py

Another solution is to use supervisord to daemonize your scripts.
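For reference, a minimal supervisord configuration for the three scripts might look like this (the file path and program names are illustrative; `priority` makes supervisord start lower-numbered programs first, which preserves the web.py → pir.py → keypadd.py order):

```ini
; hypothetical /etc/supervisor/conf.d/project.conf
[program:web]
command=python /home/pi/Ptoject/web.py
directory=/home/pi/Ptoject
autostart=true
autorestart=true
priority=1

[program:pir]
command=python /home/pi/Ptoject/pir.py
directory=/home/pi/Ptoject
autostart=true
autorestart=true
priority=2

[program:keypadd]
command=python /home/pi/Ptoject/keypadd.py
directory=/home/pi/Ptoject
autostart=true
autorestart=true
priority=3
```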



Add an ampersand to run each process in the background:

cd /home/pi/Ptoject
sudo python web.py &
sudo python pir.py &
sudo python keypadd.py

1 Comment

Thanks, it works! And how can I stop the processes? Because Ctrl-C doesn't seem to work.
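Regarding the comment: Ctrl-C only reaches the foreground script, not the backgrounded jobs. One common approach is to record each background job's PID (`$!`) and `kill` them when you want to stop. Here is a self-contained sketch that uses `sleep` as a stand-in for the real `sudo python web.py` etc.:

```shell
# Record each background job's PID so the whole group can be stopped later.
# `sleep 30` stands in for `sudo python web.py` and friends.
sleep 30 &
WEB_PID=$!
sleep 30 &
PIR_PID=$!
sleep 30 &
KEYPAD_PID=$!

# Stop all three (this is what a Ctrl-C trap or a stop script would do;
# use `sudo kill` if the scripts were started with sudo)
kill "$WEB_PID" "$PIR_PID" "$KEYPAD_PID"
wait 2>/dev/null
echo "all stopped"
```

In the real launcher you could put the `kill` line in a `trap '...' INT TERM` handler followed by `wait`, so a single Ctrl-C on the launcher stops all three children.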
