Python process pool: multiprocessing.pool

This article is adapted from http://www.cnblogs.com/kaituorensheng/p/4465768.html, with some minor changes.

When using Python for system administration, especially when operating on many file directories at once or remotely controlling many hosts, parallel execution can save a lot of time. When the number of objects to operate on is small, you can use Process from multiprocessing directly to spawn the processes dynamically; a dozen or so is fine, but with hundreds or thousands of targets, limiting the number of processes by hand becomes far too tedious. This is where a process pool comes in.
Pool provides a specified number of worker processes for the user. When a new request is submitted to the pool and the pool is not yet full, a new process is created to execute the request; but if the pool has already reached the specified maximum number of processes, the request waits until one of the pool's workers finishes and is then handled by that freed worker.
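
As a minimal sketch of the workflow just described (the with-form requires Python 3.3+ and terminates the pool on exit, so results are collected inside the block; the function name square is illustrative, and pool.map is covered in more detail further down):

from multiprocessing import Pool

def square(x):
    # runs in one of the pool's worker processes
    return x * x

if __name__ == '__main__':
    with Pool(3) as pool:                       # at most 3 worker processes
        results = pool.map(square, range(10))  # blocks until every task is done
    print(results)                              # [0, 1, 4, 9, ..., 81]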

Example 1: Using a process pool (non-blocking)

from multiprocessing import freeze_support, Pool
import time

def Foo(i):
    time.sleep(2)
    print('___time---', time.ctime())
    return i + 100

def Bar(arg):
    print('----exec done:', arg, time.ctime())

if __name__ == '__main__':
    freeze_support()  # only needed when the script is frozen into a Windows executable
    pool = Pool(3)  # the pool runs at most 3 worker processes at a time
    for i in range(4):
        # apply_async is non-blocking; when a worker finishes, the next waiting task is handed to it,
        # and Bar is called in the parent with the return value of Foo
        pool.apply_async(func=Foo, args=(i,), callback=Bar)
        # pool.apply(func=Foo, args=(i,))
    print('end')
    pool.close()
    pool.join()  # close() must be called before join(), otherwise an error is raised;
                 # after close() no new tasks are accepted, join() waits for all child processes to finish

Result:

end
___time--- Thu Jun 16 15:11:45 2016
----exec done: 100 Thu Jun 16 15:11:45 2016
___time--- Thu Jun 16 15:11:45 2016
----exec done: 101 Thu Jun 16 15:11:45 2016
___time--- Thu Jun 16 15:11:45 2016
----exec done: 102 Thu Jun 16 15:11:45 2016
___time--- Thu Jun 16 15:11:47 2016
----exec done: 103 Thu Jun 16 15:11:47 2016

Explanation:

A pool with 3 worker processes is created. range(4) yields the four values 0, 1, 2 and 3, and each is submitted to the pool. Since the pool holds only 3 workers, tasks 0, 1 and 2 start immediately; as soon as one of them finishes, the freed worker picks up task 3, which is why its output appears about two seconds after the others. Because apply_async is non-blocking, the main process does not wait for the workers: it finishes the for loop, prints "end" right away, and then blocks at pool.join() until every child process has ended.

Example 2: Using a process pool (blocking)

from multiprocessing import freeze_support, Pool
import time

def Foo(i):
    time.sleep(2)
    print('___time---', time.ctime())
    return i + 100

def Bar(arg):
    print('----exec done:', arg, time.ctime())

if __name__ == '__main__':
    freeze_support()
    pool = Pool(3)  # at most 3 worker processes in the pool
    for i in range(4):
        pool.apply(func=Foo, args=(i,))  # apply blocks until Foo has returned
    print('end')
    pool.close()
    pool.join()  # close() must be called before join(); after close() no new tasks are accepted,
                 # join() waits for all child processes to finish

Result:

___time--- Thu Jun 16 15:15:16 2016
___time--- Thu Jun 16 15:15:18 2016
___time--- Thu Jun 16 15:15:20 2016
___time--- Thu Jun 16 15:15:22 2016
end
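
To make the difference between the two submission styles concrete, here is a small timing sketch (the function name work and the rough timings in the comments are illustrative, not taken from the article): with apply() the four 2-second tasks run one after another, with apply_async() they overlap on the 3 workers.

from multiprocessing import Pool
import time

def work(i):
    time.sleep(2)
    return i

if __name__ == '__main__':
    pool = Pool(3)

    # blocking: apply() waits for each task, so 4 tasks take roughly 8 seconds
    t0 = time.time()
    for i in range(4):
        pool.apply(work, (i,))
    print('apply took %.1f s' % (time.time() - t0))

    # non-blocking: apply_async() overlaps tasks; 3 run in parallel,
    # then the last one, so roughly 4 seconds in total
    t0 = time.time()
    results = [pool.apply_async(work, (i,)) for i in range(4)]
    for r in results:
        r.get()
    print('apply_async took %.1f s' % (time.time() - t0))

    pool.close()
    pool.join()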

Example 3: Using a process pool and collecting the return values

import multiprocessing
import time

def func(msg):
    print('hello :', msg, time.ctime())
    time.sleep(2)
    print('end', time.ctime())
    return 'done' + msg

if __name__ == '__main__':
    pool = multiprocessing.Pool(2)
    result = []
    for i in range(3):
        msg = 'hello %s' % i
        result.append(pool.apply_async(func=func, args=(msg,)))

    pool.close()
    pool.join()
    for res in result:
        print('***:', res.get())  # get() returns the value that func returned
    print('All end--')

Result:


Note: get() retrieves the return value of each submitted task.
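
Beyond get(), the AsyncResult object returned by apply_async also exposes ready() and successful(); a minimal sketch (slow_add is an illustrative name, not from the article):

from multiprocessing import Pool
import time

def slow_add(i):
    time.sleep(1)
    return i + 100

if __name__ == '__main__':
    pool = Pool(2)
    res = pool.apply_async(slow_add, (1,))

    print(res.ready())          # False: the task is most likely still running
    print(res.get(timeout=5))   # 101; raises multiprocessing.TimeoutError if not done within 5 s
    print(res.successful())     # True: the task finished without raising an exception

    pool.close()
    pool.join()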

Example 4: Submitting several different functions to one process pool

import multiprocessing
import time, os, random

def Lee():
    print('\nRun task Lee--%s******ppid:%s' % (os.getpid(), os.getppid()), '~~~~', time.ctime())
    start = time.time()
    time.sleep(random.randrange(10))
    end = time.time()
    print('Task Lee,runs %0.2f seconds.' % (end - start), '~~~~', time.ctime())

def Marlon():
    print("\nRun task Marlon-%s******ppid:%s" % (os.getpid(), os.getppid()), '~~~~', time.ctime())
    start = time.time()
    time.sleep(random.random() * 40)
    end = time.time()
    print('Task Marlon runs %0.2f seconds.' % (end - start), '~~~~', time.ctime())

def Allen():
    print("\nRun task Allen-%s******ppid:%s" % (os.getpid(), os.getppid()), '~~~~', time.ctime())
    start = time.time()
    time.sleep(random.random() * 30)
    end = time.time()
    print('Task Allen runs %0.2f seconds.' % (end - start), '~~~~', time.ctime())

def Frank():
    print("\nRun task Frank-%s******ppid:%s" % (os.getpid(), os.getppid()), '~~~~', time.ctime())
    start = time.time()
    time.sleep(random.random() * 20)
    end = time.time()
    print('Task Frank runs %0.2f seconds.' % (end - start), '~~~~', time.ctime())

if __name__ == '__main__':
    func_list = [Lee, Marlon, Allen, Frank]
    print('parent process id %s' % os.getpid())

    pool = multiprocessing.Pool(4)
    for func in func_list:
        pool.apply_async(func)  # submit each function asynchronously; a free worker picks up the next waiting task

    print('Waiting for all subprocesses done...')
    pool.close()
    pool.join()    # close() must be called before join(), otherwise an error is raised;
                   # after close() no new tasks are accepted, join() waits for all child processes to finish
    print('All subprocesses done.')

Result:

parent process id 98552
Waiting for all subprocesses done...

Run task Lee--97316******ppid:98552 ~~~~ Thu Jun 16 15:20:50 2016

Run task Marlon-95536******ppid:98552 ~~~~ Thu Jun 16 15:20:50 2016

Run task Allen-95720******ppid:98552 ~~~~ Thu Jun 16 15:20:50 2016

Run task Frank-98784******ppid:98552 ~~~~ Thu Jun 16 15:20:50 2016
Task Allen runs 0.31 seconds. ~~~~ Thu Jun 16 15:20:51 2016
Task Lee,runs 7.00 seconds. ~~~~ Thu Jun 16 15:20:57 2016
Task Frank runs 14.48 seconds. ~~~~ Thu Jun 16 15:21:05 2016
Task Marlon runs 31.72 seconds. ~~~~ Thu Jun 16 15:21:22 2016
All subprocesses done.
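
One detail worth knowing when submitting tasks with apply_async as above: if a task raises an exception, the parent process does not see it unless it calls get() on the corresponding result (or passes an error_callback). A small sketch of this, with an illustrative function name risky:

from multiprocessing import Pool

def risky(i):
    if i == 2:
        raise ValueError('bad input: %s' % i)
    return i

if __name__ == '__main__':
    pool = Pool(2)
    results = [pool.apply_async(risky, (i,)) for i in range(4)]
    pool.close()
    pool.join()

    for r in results:
        try:
            print('ok:', r.get())
        except ValueError as e:
            # the exception raised in the worker is re-raised here in the parent
            print('failed:', e)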

multiprocessing pool map

# coding: utf-8
import multiprocessing

def m1(x):
    print(x * x)

if __name__ == '__main__':
    pool = multiprocessing.Pool(multiprocessing.cpu_count())  # one worker per CPU core
    i_list = range(8)
    pool.map(m1, i_list)  # distribute the items of i_list over the workers

One execution result

0
1
4
9
16
25
36
49
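
The m1 function above only prints; map can also collect return values, and it returns them in the same order as the input, no matter which worker finished first. A minimal sketch (m2 is an illustrative name):

import multiprocessing

def m2(x):
    return x * x

if __name__ == '__main__':
    pool = multiprocessing.Pool(multiprocessing.cpu_count())
    squares = pool.map(m2, range(8))   # blocks until every task is done
    pool.close()
    pool.join()
    print(squares)                     # [0, 1, 4, 9, 16, 25, 36, 49]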

Reference: http://www.dotblogs.com.tw/rickyteng/archive/2012/02/20/69635.aspx

Question: http://bbs.chinaunix.net/thread-4111379-1-1.html

# coding: utf-8
import multiprocessing
import logging

def create_logger(i):
    print(i)

class CreateLogger(object):
    def __init__(self, func):
        self.func = func

if __name__ == '__main__':
    ilist = range(10)

    cl = CreateLogger(create_logger)
    pool = multiprocessing.Pool(multiprocessing.cpu_count())
    pool.map(cl.func, ilist)  # cl.func is simply the module-level function create_logger, so it pickles fine

    print("hello------------>")

One execution result

0
1
2
3
4
5
6
7
8
9
hello------------>
