python using multiprocessing pool.map for distributed computing
The following code works to execute "someprogram" in parallel for 50 inputfilenames, using a pool of 5 workers. How can 5 workers be used on this computer, and workers on 7 other computers as well, using only standard libraries such as multiprocessing? Ideally I would have a list of tuples (hostname, amountofworkers) that could be used to speed things up, and maybe turn this into a decorator so that it can be applied more easily to methods like "commandlinestuff". (Linux and Python 2.7)

    import multiprocessing
    import subprocess

    def commandlinestuff(inputfilename):
        p = subprocess.Popen("someprogram " + inputfilename, shell=True)
        p.wait()

    inputfilenames = ["something" + str(i).zfill(2) + ".in" for i in range(50)]
    p = multiprocessing.Pool(5)
    p.map(commandlinestuff, inputfilenames)

It sounds like you are trying to re-invent Pyro, which is itself written in pure Python but is not currently part of the standard library. Basically you need a server running on the ...
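The answer is cut off above, but since the question asks for a standard-library-only route, here is a minimal sketch of that kind of server/worker setup, using multiprocessing.managers.BaseManager to publish a job queue over the network. Everything specific here is an illustrative assumption, not the answerer's code: the port 50000, the authkey 'changeme', the script names, the 'serverhost' argument and the queue names are placeholders, and the worker script would be started on each machine with whatever process count that box should contribute.

    # distribute_server.py -- hypothetical sketch; run on the machine that hands out work
    import Queue
    from multiprocessing.managers import BaseManager

    job_queue = Queue.Queue()
    result_queue = Queue.Queue()

    class QueueManager(BaseManager):
        pass

    # Expose the two queues to any client that knows the address and authkey.
    QueueManager.register('get_job_queue', callable=lambda: job_queue)
    QueueManager.register('get_result_queue', callable=lambda: result_queue)

    if __name__ == '__main__':
        # Queue up the 50 input files before serving.
        for i in range(50):
            job_queue.put("something" + str(i).zfill(2) + ".in")
        manager = QueueManager(address=('', 50000), authkey='changeme')
        server = manager.get_server()
        server.serve_forever()

    # distribute_worker.py -- hypothetical sketch; run with 5 processes on this computer
    # and with however many processes each of the other computers should use
    import Queue
    import subprocess
    import sys
    import multiprocessing
    from multiprocessing.managers import BaseManager

    class QueueManager(BaseManager):
        pass

    # Register the same names as the server, but without callables: we only consume proxies.
    QueueManager.register('get_job_queue')
    QueueManager.register('get_result_queue')

    def commandlinestuff(inputfilename):
        p = subprocess.Popen("someprogram " + inputfilename, shell=True)
        p.wait()
        return p.returncode

    def worker(server_host):
        manager = QueueManager(address=(server_host, 50000), authkey='changeme')
        manager.connect()
        jobs = manager.get_job_queue()
        results = manager.get_result_queue()
        while True:
            try:
                inputfilename = jobs.get_nowait()
            except Queue.Empty:
                break  # no work left on the server
            results.put((inputfilename, commandlinestuff(inputfilename)))

    if __name__ == '__main__':
        # e.g. python distribute_worker.py serverhost 5
        host, nworkers = sys.argv[1], int(sys.argv[2])
        procs = [multiprocessing.Process(target=worker, args=(host,))
                 for _ in range(nworkers)]
        for proc in procs:
            proc.start()
        for proc in procs:
            proc.join()

The get_server()/serve_forever() and connect() calls are the standard remote-manager pattern from the multiprocessing documentation; turning the (hostname, amountofworkers) list into running workers would be handled by whatever launches the worker script on each machine, for example over ssh.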