python using multiprocessing pool.map for distributed computing -


The following code executes "someprogram" in parallel for 50 input filenames, using a pool of 5 workers. How can I use 5 workers on this computer and 7 workers on another computer, using only standard libraries such as multiprocessing? Ideally I would have a list of tuples (hostname, amountofworkers) that gets used to make things go faster, and maybe turn it into a decorator so that it can be applied more easily to methods like "commandlinestuff". (Linux and Python 2.7)

    import multiprocessing
    import subprocess

    def commandlinestuff(inputfilename):
        p = subprocess.Popen("someprogram " + inputfilename, shell=True)
        p.wait()

    inputfilenames = ["some" + str(i).zfill(2) + ".in" for i in range(50)]
    p = multiprocessing.Pool(5)
    p.map(commandlinestuff, inputfilenames)

It sounds like you are trying to re-invent pyro, which is itself written in pure Python but is not currently part of the standard library.
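For reference, remote execution with pyro looks roughly like the sketch below (written against the Pyro4 flavour of the library; the Worker class, its run method, the object id "worker" and port 9999 are all illustrative choices, not anything from the question):

    # server side - run on each remote machine
    import subprocess
    import Pyro4

    @Pyro4.expose
    class Worker(object):
        def run(self, inputfilename):
            # same work as commandlinestuff() in the question
            p = subprocess.Popen("someprogram " + inputfilename, shell=True)
            p.wait()
            return inputfilename

    daemon = Pyro4.Daemon(host="0.0.0.0", port=9999)
    uri = daemon.register(Worker(), objectId="worker")
    print(uri)            # e.g. PYRO:worker@0.0.0.0:9999
    daemon.requestLoop()

    # client side - call the remote worker through a proxy
    import Pyro4
    worker = Pyro4.Proxy("PYRO:worker@remotehost:9999")
    print(worker.run("some00.in"))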

Basically you need a server running on the remote machine(s) that accepts a connection, receives a pickled object to execute (and possibly some data), executes it and posts the result back. You will also need a client on the local machine that does the posting, gathers the results and potentially does some load balancing.
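Staying within the standard library, multiprocessing.managers can play that server role: one process serves a shared task queue and a result queue, and each machine runs as many worker processes against it as you like. A minimal sketch, assuming port 50000, an authkey of b'secret' and registration names chosen here for illustration:

    # jobserver.py - run once, on the machine that owns the 50 input files
    from multiprocessing.managers import BaseManager
    try:
        import queue            # Python 3
    except ImportError:
        import Queue as queue   # Python 2.7, as in the question

    task_queue = queue.Queue()
    result_queue = queue.Queue()

    class QueueManager(BaseManager):
        pass

    QueueManager.register('get_task_queue', callable=lambda: task_queue)
    QueueManager.register('get_result_queue', callable=lambda: result_queue)

    if __name__ == '__main__':
        for i in range(50):
            task_queue.put("some" + str(i).zfill(2) + ".in")
        manager = QueueManager(address=('', 50000), authkey=b'secret')
        server = manager.get_server()
        server.serve_forever()   # hand out tasks until killed

    # worker.py - run on this computer with 5 processes and on the other
    # computer with 7:  python worker.py jobserver_host 5
    import subprocess
    import sys
    import multiprocessing
    from multiprocessing.managers import BaseManager

    class QueueManager(BaseManager):
        pass

    QueueManager.register('get_task_queue')
    QueueManager.register('get_result_queue')

    def commandlinestuff(inputfilename):
        p = subprocess.Popen("someprogram " + inputfilename, shell=True)
        p.wait()
        return inputfilename

    def worker_loop(address):
        manager = QueueManager(address=address, authkey=b'secret')
        manager.connect()
        tasks = manager.get_task_queue()
        results = manager.get_result_queue()
        while True:
            try:
                inputfilename = tasks.get_nowait()
            except Exception:    # queue.Empty is re-raised by the proxy
                break
            results.put(commandlinestuff(inputfilename))

    if __name__ == '__main__':
        address = (sys.argv[1], 50000)
        n = int(sys.argv[2])
        procs = [multiprocessing.Process(target=worker_loop, args=(address,))
                 for _ in range(n)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()

Each extra machine then only needs the worker script and network access to port 50000; starting it with a different count gives you the (hostname, amountofworkers) behaviour the question asks for, with the shared queue doing the load balancing.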

The Parallel Processing entry in the Python wiki gives a long list of tools for doing this kind of work, with various advantages and disadvantages.

