
My goal is to create one main Python script that executes multiple independent Python scripts on Windows Server 2012 at the same time. One benefit in my mind is that I can point Task Scheduler to one main.py script as opposed to multiple .py scripts. My server has 1 CPU. I have read up on multiprocessing, threading and subprocess, which only added to my confusion a bit. I am basically running multiple trading scripts for different stock symbols, all at the same time after market open at 9:30 EST. The following is my attempt, but I have no idea whether it is right. Any direction/feedback is highly appreciated!

import subprocess

subprocess.Popen(["python", '1.py'])
subprocess.Popen(["python", '2.py'])
subprocess.Popen(["python", '3.py'])
subprocess.Popen(["python", '4.py'])
  • Try using queuing, for example SQS. Commented Nov 28, 2017 at 18:52
  • I love Python -- but this may be one of those times when bash would be a better tool for you: stackoverflow.com/questions/28549641/… Commented Nov 28, 2017 at 18:53
  • Are the python scripts related in some way or completely unrelated? If yes, what's the relation? Commented Nov 28, 2017 at 18:55
  • Bash? On Windows Server 2012? Commented Nov 28, 2017 at 18:55
  • @SteveJ bash -> batch. Pretty much the exact same thing can be accomplished with START "" "path-to-python" "path-to-script" Commented Nov 28, 2017 at 19:24

3 Answers


I think I'd try to do it like this:

from multiprocessing import Pool

def do_stuff_with_stock_symbol(symbol):
    # placeholder: replace _call_api() with the real per-symbol work
    return _call_api()

if __name__ == '__main__':
    symbols = ["GOOG", "AAPL", "TSLA"]
    p = Pool(len(symbols))
    results = p.map(do_stuff_with_stock_symbol, symbols)
    print(results)

(Modified example from multiprocessing introduction: https://docs.python.org/3/library/multiprocessing.html#introduction)

Consider using a constant pool size if you deal with a lot of stock symbols, because every Python process uses some amount of memory.

Also, please note that using threads might be a lot better if you are dealing with an I/O-bound workload (calling an API, writing to and reading from disk). Processes only really become necessary in Python for compute-bound workloads, because the global interpreter lock prevents threads from running Python bytecode in parallel.

An example using threads and the concurrent futures library would be:

import concurrent.futures

TIMEOUT = 60

def do_stuff_with_stock_symbol(symbol, timeout):
    # placeholder: replace _call_api() with the real per-symbol work,
    # honoring the given timeout
    return _call_api()

if __name__ == '__main__':
    symbols = ["GOOG", "AAPL", "TSLA"]

    with concurrent.futures.ThreadPoolExecutor(max_workers=len(symbols)) as executor:
        results = {executor.submit(do_stuff_with_stock_symbol, symbol, TIMEOUT): symbol for symbol in symbols}

        for future in concurrent.futures.as_completed(results):
            symbol = results[future]
            try:
                data = future.result()
            except Exception as exc:
                print('{} generated an exception: {}'.format(symbol, exc))
            else:
                print('stock symbol: {}, result: {}'.format(symbol, data))

(Modified example from: https://docs.python.org/3/library/concurrent.futures.html#threadpoolexecutor-example)

Note that threads will still use some memory, but less than processes.

You could use asyncio or green threads if you want to reduce memory consumption per stock symbol to a minimum, but at some point you will run into network bandwidth problems because of all the concurrent API calls :)


Comments

Wow, this might be a better approach than the solution I imagined. So do I put all the regular code after the return call within the do_stuff_with_stock_symbol(symbol) method? Any idea how to do the same thing with threading, as I only have 1 GB of RAM on my server?
Instead of return _call_api() you can do (probably almost) anything you want. I added a threaded example.
Thank you for the answer! Do you recommend the threaded example given you have a good idea of what I am trying to do which is run multiple trading scripts for different symbols all at the same time after market open at 9:30 EST?
Not sure if I have enough info to recommend anything, but for I/O bound workloads threads and asyncio make more sense than processes IMO. If this helped you, please mark my answer as accepted :)
@gibbz00 In other words: If you are not trying to call the API AND compute thousands of prime numbers, use threads or asyncio! :)

While what you're asking might not be the best way to handle what you're doing, I've wanted to do similar things in the past, and it took a while to find what I needed, so to answer your question:

I'm not promising this to be the "best" way to do it, but it worked in my use case.

I created a class to extend threading.Thread.

thread.py

"""
Extends threading.Thread, giving access to a Thread object which accepts
a thread_id, a thread name, and a function at instantiation time. The
function is called when the thread's start() method is called.
"""

import threading


class Thread(threading.Thread):
    def __init__(self, thread_id, name, func):
        threading.Thread.__init__(self)
        self.threadID = thread_id
        self.name = name

        # the function that should be run in the thread.
        self.func = func

    def run(self):
        return self.func()

I needed some work done that was part of another package

work_module.py

import...

def func_that_does_work():
    # do some work
    pass

def more_work():
    # do some work
    pass

Then the main script I wanted to run main.py

from thread import Thread
import work_module as wm


mythreads = []
mythreads.append(Thread(1, "a_name", wm.func_that_does_work))
mythreads.append(Thread(2, "another_name", wm.more_work))

for t in mythreads:
    t.start()

The threads die when run() returns. Since this extends threading.Thread, there are several other options available in the docs here: https://docs.python.org/3/library/threading.html
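One thing worth noting: the main.py above starts the threads but never joins them. If main.py should not exit until all the workers finish, join each thread after starting all of them. A minimal sketch using plain threading.Thread with target/args (the work function and names here are illustrative, not from work_module):

```python
import threading

results = {}

def work(name):
    # illustrative stand-in for the real per-thread work
    results[name] = name.upper()

threads = [threading.Thread(target=work, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()       # all workers run concurrently
for t in threads:
    t.join()        # block until each worker's run() has returned
print(results)
```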



If all you're looking to do is automate the startup, creating a .bat file is a great and simple alternative to trying to do it with another python script.

The example linked in the comments shows how to do it with bash on Unix-based machines, but batch files can do a very similar thing with the START command:

start_py.bat:

START "" /B "path\to\python.exe" "path\to\script_1.py"
START "" /B "path\to\python.exe" "path\to\script_2.py"
START "" /B "path\to\python.exe" "path\to\script_3.py"

The full syntax for START can be found here.
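If you would rather keep everything in a single main.py after all, the question's subprocess approach also works once the parent waits on the children. A sketch using the question's 1.py..4.py names, with sys.executable so the children use the same interpreter as main.py:

```python
import subprocess
import sys

scripts = ["1.py", "2.py", "3.py", "4.py"]

# launch every script concurrently; Popen returns immediately
procs = [subprocess.Popen([sys.executable, s]) for s in scripts]

# optionally block until they all finish and collect the exit codes
codes = [p.wait() for p in procs]
print(codes)
```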

