Python multiprocessing and queue
By Pierre-Yves on Thursday, November 27 2014, 08:28 - General
Every once in a while I want to run a program in parallel but gather its output in a single process, so that I do not have concurrent accesses (think, for example, of several processes computing something and storing the output in a file or in a database). I could use locks for this, but I figured I could also use a queue.
My problem is that I always forget how I do it and always need to search for it when I want to do it again :-) So, for you as much as for me, here is an example:
```python
# -*- coding: utf-8 -*-
import itertools
from multiprocessing import Pool, Manager

def do_something(arg):
    """This function does something important in parallel, but we
    want to centralize the output, hence the queue."""
    data, myq = arg
    print(data)
    myq.put(data)

data = range(100)
m = Manager()
q = m.Queue()
p = Pool(5)
# Pair each item with the queue proxy so every worker can put results on it
p.map(do_something, itertools.product(data, [q]))
with open('output', 'w') as stream:
    # map() has returned, so all items are already on the queue
    while q.qsize():
        print(q.qsize())
        item = q.get()
        print(item)
        stream.write('%s\n' % item)
        q.task_done()  # pairs with the get() above
q.join()
```

Note that `task_done()` belongs in the consumer, after each `get()`, not in the workers after `put()` -- otherwise `q.join()` has nothing meaningful to wait for.
There are probably other/better ways to do this, but that's a start :-)