How can I call an external program with a python script and retrieve the output and return code?
- There are some existing questions and answers on SO that will help you: stackoverflow.com/questions/89228/… – Jarret Hardie, Apr 1, 2009 at 19:27
- Does this answer your question? How to execute a program or call a system command? – robertspierre, Nov 20, 2021 at 10:33
- Can anyone tell me how to store the output to a log.csv file? (I need this for git, so the csv file is the git commit history.) – vpp, Mar 17, 2022 at 16:50
6 Answers
Look at the subprocess module: a simple example follows...
from subprocess import Popen, PIPE

# Run "ls -la ." and capture its standard output
process = Popen(["ls", "-la", "."], stdout=PIPE)
(output, err) = process.communicate()  # output is bytes; err is None unless stderr=PIPE
exit_code = process.wait()             # the program's return code
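On Python 3.7+ (which adds the capture_output and text keywords), subprocess.run() does the waiting and capturing in one call; a minimal sketch using the same ls command:

import subprocess

# run() waits for the command to finish and returns a CompletedProcess
result = subprocess.run(["ls", "-la", "."], capture_output=True, text=True)
print(result.returncode)  # exit code
print(result.stdout)      # captured standard output as str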
5 Comments
- See also subprocess.run() (Python >= 3.5 is required).
- To capture stderr as well: process = Popen(["ls", "-la", "."], stdout=PIPE, stderr=PIPE)

Following Ambroz Bizjak's previous comment, here is a solution that worked for me:
import shlex
from subprocess import Popen, PIPE

cmd = "..."
# shlex.split() turns the command string into an argument list,
# honouring shell-style quoting
process = Popen(shlex.split(cmd), stdout=PIPE)
(output, err) = process.communicate()  # capture stdout (bytes)
exit_code = process.wait()
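For illustration (the cmd string above is elided in the original), shlex.split handles quoted arguments that a plain str.split would break apart; a quick sketch with a hypothetical command string:

import shlex

# shlex.split keeps the quoted argument together, unlike str.split
print(shlex.split('grep -r "hello world" .'))
# ['grep', '-r', 'hello world', '.']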
After some research, I have the following code, which works very well for me. It basically prints both stdout and stderr in real time. I hope it helps someone else who needs it.
import subprocess
import sys
import threading

stdout_result = 1
stderr_result = 1

def stdout_thread(pipe):
    global stdout_result
    while True:
        out = pipe.stdout.read(1)
        stdout_result = pipe.poll()
        if out == '' and stdout_result is not None:
            break
        if out != '':
            sys.stdout.write(out)
            sys.stdout.flush()

def stderr_thread(pipe):
    global stderr_result
    while True:
        err = pipe.stderr.read(1)
        stderr_result = pipe.poll()
        if err == '' and stderr_result is not None:
            break
        if err != '':
            sys.stdout.write(err)
            sys.stdout.flush()

def exec_command(command, cwd=None):
    if cwd is not None:
        print('[' + ' '.join(command) + '] in ' + cwd)
    else:
        print('[' + ' '.join(command) + ']')
    # text=True makes the pipes return str instead of bytes,
    # so the '' comparisons in the reader threads work
    p = subprocess.Popen(
        command, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        cwd=cwd, text=True
    )
    out_thread = threading.Thread(name='stdout_thread', target=stdout_thread, args=(p,))
    err_thread = threading.Thread(name='stderr_thread', target=stderr_thread, args=(p,))
    err_thread.start()
    out_thread.start()
    out_thread.join()
    err_thread.join()
    # combined poll() results collected by the two reader threads
    return stdout_result + stderr_result
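A minimal usage sketch for the exec_command helper above; the command and working directory are just placeholders:

# Stream the output of "ls -la" live and get back the combined poll results
ret = exec_command(['ls', '-la'], cwd='.')
print('combined result:', ret)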
1 Comment
- out is of type bytes, so it cannot be used in the write method. Also, it prints the characters but never stops.

Check out the subprocess module here: http://docs.python.org/library/subprocess.html#module-subprocess. It should get what you need done.
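Since that answer only links to the docs, here is a minimal sketch of one way to get both the output and the return code with subprocess.check_output (Python 3.7+ for text=True; the ls command is just an example, not necessarily what the answerer had in mind):

import subprocess

try:
    # check_output returns the command's stdout and raises on a non-zero exit
    output = subprocess.check_output(["ls", "-la", "."], text=True)
    return_code = 0
except subprocess.CalledProcessError as exc:
    output = exc.output
    return_code = exc.returncode

print(return_code, output)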
I've developed a little library (py-execute) that allows you to execute external programs, retrieve the output and the return code, and at the same time get the output in the console in real time:
>>> from py_execute.process_executor import execute
>>> ret = execute('echo "Hello"')
Hello
>>> ret
(0, 'Hello\n')
You can avoid printing to the console by passing a mock user_io:
>>> from mock import Mock
>>> execute('echo "Hello"', ui=Mock())
(0, 'Hello\n')
I wrote it because with plain Popen (in Python 2.7) I was having trouble executing commands with long output.
I know this is a pretty old thread, but I wanted to add something I found while researching this, since it helped me quite a bit.
In subprocess, if you don't need total control over the pipes, you can use the run() function.
This is my current code, which works rather nicely for my own program.
import subprocess

# text=True decodes the program's output; stdout/stderr are not captured here,
# so they stream straight to the console while the command runs
process = subprocess.run(["./program", "cm1", "cm2"], text=True)
print(process)
This prints the stdout live. It can also be configured to capture both stdout and stderr in the same stream, as in the sketch below.
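A minimal sketch of that merged-stream variant, assuming the same hypothetical ./program executable:

import subprocess

# Redirect stderr into stdout so both streams arrive in one captured string
process = subprocess.run(
    ["./program", "cm1", "cm2"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
print(process.returncode)  # exit code
print(process.stdout)      # interleaved stdout + stderr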