
multiprocessing

python multiprocessing + peewee + postgresql fails with SSL error

I am trying to write a Python model which is capable of doing some processing in a PostgreSQL database using the multiprocessing module and peewee. In single-core mode the code works; however, when I try to run the code with multiple cores I am running into an SSL error. I would like to post the structure of my model in the hope that somebody can advise how to set up my model in a proper way. Currently, I have chosen an object-oriented approach in which I make one connection which is shared in a pool. To clarify what I have done, I will now show the source code I have so far. I have three

2021-06-15 12:58:04    Category: Q&A    python   python-3.x   postgresql   multiprocessing   peewee
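A common cause of this SSL error is sharing one PostgreSQL connection object across forked worker processes; the usual fix is to open a fresh connection inside each worker, after the fork. Below is a minimal sketch of that pattern using a stdlib sqlite3 connection as a stand-in (the `init_worker` and `query` names are illustrative; with peewee you would call `db.connect()` inside the pool initializer instead):

```python
import multiprocessing
import sqlite3

_conn = None  # one private connection per worker process


def init_worker():
    # Open the connection *inside* the worker, after fork/spawn.
    # With peewee: db.connect() here, never in the parent.
    global _conn
    _conn = sqlite3.connect(":memory:")


def query(x):
    # Each worker talks to the database over its own connection,
    # so no SSL/socket state is ever shared between processes.
    return _conn.execute("SELECT ? + 1", (x,)).fetchone()[0]


if __name__ == "__main__":
    with multiprocessing.Pool(2, initializer=init_worker) as pool:
        print(pool.map(query, range(5)))  # [1, 2, 3, 4, 5]
```

The key design point is that the parent never hands a live connection to a child; each child builds its own in `initializer`.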

MultiProcessing Pipe recv blocks even when child process is defunct

Reading several questions on this topic, I understand now that the child process inherits the file descriptors from the parent process, which makes it more difficult for a child to receive an EOFError when a parent closes the connection. But my situation is the other way around, and I do not understand the problem I am facing. I have a parent process that starts a child process and gives it access to one end of the Pipe connection I created. Now when the child process is done, malfunctions, or whatever, everything is stopped and the connection is closed. At this point the child process

2021-06-15 10:28:41    Category: Q&A    python   multiprocessing   pipe   eoferror
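The usual reason `recv()` keeps blocking after the other side dies is that the reading process still holds its own duplicate of the write end, so the OS never sees the last writer close. Closing that duplicate right after `start()` lets `recv()` raise EOFError instead of hanging. A hedged sketch of the pattern (the `run` helper is illustrative, not from the question):

```python
import multiprocessing


def child(conn):
    conn.send("hello")
    conn.close()  # child is finished with its write end


def run():
    parent_conn, child_conn = multiprocessing.Pipe()
    p = multiprocessing.Process(target=child, args=(child_conn,))
    p.start()
    child_conn.close()  # crucial: drop the parent's duplicate of the child's end
    msg = parent_conn.recv()
    got_eof = False
    try:
        parent_conn.recv()  # no writers remain -> EOFError, not a hang
    except EOFError:
        got_eof = True
    p.join()
    return msg, got_eof


if __name__ == "__main__":
    print(run())  # ('hello', True)
```

Without the `child_conn.close()` in the parent, the second `recv()` would block forever even after the child becomes defunct, because the parent itself still holds an open writer.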

Calling multiple instances of python scripts in matlab using java.lang.Runtime.getRuntime not working

I am running Matlab2017 on Windows 10. I call a Python script that runs a Speech Recognition task on the cloud with something like this: userAuthCode = 1; % authentication code for user account to be run on cloud cmd = ['C:\Python27\python.exe runASR.py ' num2str(userAuthCode)]; system(cmd); When the above command is called, the Python script runs the input audio file on the ASR cloud engine, and as it runs, I can see Speech Recognition scores for the audio file from Python in the Matlab console. I want to do the following: (1) Execute multiple such commands in parallel. Let's say I have 2 input audio files

2021-06-15 07:41:58    Category: Q&A    java   python   matlab   multiprocessing

How to set the priority to get the mutex in C/C++

I have 3 processes of equal priority: P1, P2, and P3 (a timer). The priority for acquiring the mutex is as follows: P1 (priority 1), P2 (priority 2), P3 (timer, priority 3). Suppose P3 arrives and gets the mutex, then P2 arrives and waits for the mutex, and after that P1 arrives and also waits for it. When P3 releases the mutex, P1 should get it, not P2. How can I implement this in C or C++? Note: all processes are actually threads running at the same priority. OS: Windows XP

2021-06-15 07:41:14    Category: Q&A    c++   c   windows   mutex   multiprocessing
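One way to get this ordering is a condition variable plus an explicit priority queue of waiters: on release, wake everyone and let only the lowest-numbered waiter proceed. The idea is sketched below in Python for brevity (the `PriorityMutex` class is a hypothetical illustration; in C/C++ the same scheme maps onto `pthread_cond_t` or a Win32 `CONDITION_VARIABLE` guarding a waiter heap):

```python
import heapq
import itertools
import threading


class PriorityMutex:
    """Mutex granted to the waiter with the lowest priority number first."""

    def __init__(self):
        self._cond = threading.Condition()
        self._locked = False
        self._waiters = []              # min-heap of (priority, arrival seq)
        self._seq = itertools.count()   # tie-breaker for equal priorities

    def acquire(self, priority):
        with self._cond:
            ticket = (priority, next(self._seq))
            heapq.heappush(self._waiters, ticket)
            # Proceed only when the mutex is free AND we are the best waiter.
            while self._locked or self._waiters[0] != ticket:
                self._cond.wait()
            heapq.heappop(self._waiters)
            self._locked = True

    def release(self):
        with self._cond:
            self._locked = False
            self._cond.notify_all()     # wake all; the heap decides who wins
```

So if P3 holds the lock while P2 and then P1 queue up, `release()` wakes both, but only P1 (priority 1, the heap top) passes the wait condition; P2 runs after P1 releases.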

Python pool map and choosing number of processes

In setting the number of processes, I'd be keen to see how many threads I can actually use on my machine - how do I find this? Is there a way to determine the number of threads available to me?

2021-06-15 03:41:52    Category: Q&A    python   map   multiprocessing   reduce
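The figure usually wanted when sizing a `Pool` is the number of logical CPUs, which the stdlib reports directly (and which `Pool` uses as its default when given no argument). A short sketch:

```python
import multiprocessing
import os

# Logical CPUs on the machine (hyper-threads count individually).
print(multiprocessing.cpu_count())

# On Linux, the count this *process* may actually use
# (respects taskset / cgroup restrictions):
if hasattr(os, "sched_getaffinity"):
    print(len(os.sched_getaffinity(0)))
```

Note that for CPU-bound `pool.map` work the process count, not a thread count, is what matters: CPython threads do not execute bytecode in parallel because of the GIL.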

Multiprocessing slower than serial processing in Windows (but not in Linux)

I'm trying to parallelize a for loop to speed up my code, since the loop's processing operations are all independent. Following online tutorials, it seems the standard multiprocessing library in Python is a good start, and I've got this working for basic examples. However, for my actual use case, I find that parallel processing (using a dual-core machine) is actually a little (<5%) slower when run on Windows. Running the same code on Linux, however, results in a parallel processing speed-up of ~25% compared to serial execution. From the docs, I believe this may relate to Windows' lack of fork(

2021-06-15 03:36:07    Category: Q&A    python   multithreading   parallel-processing   multiprocessing   python-multiprocessing
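Windows indeed has no fork(): multiprocessing there spawns each worker by starting a fresh interpreter and re-importing the main module, which makes process startup and data transfer far more expensive than on Linux. Two standard mitigations are guarding the entry point and handing each worker coarse chunks of work so the startup cost amortizes. A sketch (the chunking scheme here is illustrative):

```python
import multiprocessing


def work(chunk):
    # Coarse-grained task: each worker handles many items per call, so the
    # per-process startup cost (large under Windows "spawn") is amortized.
    return sum(x * x for x in chunk)


if __name__ == "__main__":  # required on Windows: children re-import this module
    data = list(range(1_000_000))
    n = multiprocessing.cpu_count()
    chunks = [data[i::n] for i in range(n)]  # one large slice per worker
    with multiprocessing.Pool(n) as pool:
        total = sum(pool.map(work, chunks))
    print(total)
```

If each task is tiny, the pickling and process-creation overhead under spawn can easily eat the 25% you would otherwise gain, which matches the Windows-only slowdown described above.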

Python threading vs. multiprocessing in Linux

Based on this question I assumed that creating a new process should be almost as fast as creating a new thread in Linux. However, a little test showed a very different result. Here's my code:

    from multiprocessing import Process, Pool
    from threading import Thread

    times = 1000

    def inc(a):
        b = 1
        return a + b

    def processes():
        for i in xrange(times):
            p = Process(target=inc, args=(i,))
            p.start()
            p.join()

    def threads():
        for i in xrange(times):
            t = Thread(target=inc, args=(i,))
            t.start()
            t.join()

Tests:

    >>> timeit processes()
    1 loops, best of 3: 3.8 s per loop
    >>> timeit threads()
    10 loops, best of 3: 98.6

2021-06-14 22:28:12    Category: Q&A    python   linux   multithreading   multiprocessing

Python Distributed Computing (works)

I'm using an old thread to post new code which attempts to solve the same problem. What constitutes a secure pickle? this? sock.py:

    from socket import socket
    from socket import AF_INET
    from socket import SOCK_STREAM
    from socket import gethostbyname
    from socket import gethostname

    class SocketServer:
        def __init__(self, port):
            self.sock = socket(AF_INET, SOCK_STREAM)
            self.port = port

        def listen(self, data):
            self.sock.bind(("127.0.0.1", self.port))
            self.sock.listen(len(data))
            while data:
                s = self.sock.accept()[0]
                siz, dat = data.pop()
                s.send(siz)
                s.send(dat)
                s.close()

    class Socket:
        def __init__

2021-06-14 21:25:48    Category: Q&A    python   sockets   multiprocessing   pickle   distributed-computing

Python Shared Memory Array, no attribute get_obj()

I am working on manipulating numpy arrays using the multiprocessing module and am running into an issue trying out some of the code I have run across here. Specifically, I am creating a ctypes array from a numpy array and then trying to return the ctypes array to a numpy array. Here is the code:

    shared_arr = multiprocessing.RawArray(_numpy_to_ctypes[array.dtype.type], array.size)

I do not need any kind of synchronization lock, so I am using RawArray. The ctypes data type is pulled from a dictionary based on the dtype of the input array. That is working wonderfully.

    shared_arr = numpy.ctypeslib

2021-06-14 18:58:20    Category: Q&A    python   numpy   multiprocessing   ctypes
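The `get_obj()` method only exists on `multiprocessing.Array`, where it unwraps the lock wrapper; a `RawArray` is already the bare ctypes buffer, so it can be handed straight to `numpy.frombuffer` for a zero-copy view. A minimal sketch, assuming a float64 array:

```python
import ctypes
import multiprocessing

import numpy as np

arr = np.array([1.0, 2.0, 3.0])

# RawArray has no lock and therefore no .get_obj();
# it already exposes the raw buffer.
shared = multiprocessing.RawArray(ctypes.c_double, arr.size)

view = np.frombuffer(shared, dtype=np.float64)  # zero-copy numpy view
view[:] = arr                                    # copy the data in

print(view)  # [1. 2. 3.]
```

Had the code used `multiprocessing.Array` (with a lock) instead, the equivalent call would be `np.frombuffer(shared.get_obj())`.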

Python multiprocessing doesn't play nicely with uuid.uuid4()

I'm trying to generate a uuid for a filename, and I'm also using the multiprocessing module. Unpleasantly, all of my uuids end up exactly the same. Here is a small example:

    import multiprocessing
    import uuid

    def get_uuid( a ):
        ## Doesn't help to cycle through a bunch.
        #for i in xrange(10): uuid.uuid4()

        ## Doesn't help to reload the module.
        #reload( uuid )

        ## Doesn't help to load it at the last minute.
        ## (I simultaneously comment out the module-level import.)
        #import uuid

        ## uuid1() does work, but it differs only in the first 8 characters and includes identifying information about the computer

2021-06-14 18:16:57    Category: Q&A    python   time   random   multiprocessing   uuid
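On old Python versions, `uuid4()` could fall back to the `random` module, whose state is copied into every forked child, so every worker produced the same sequence of UUIDs. Building the UUID from `os.urandom` bytes side-steps any inherited RNG state. A hedged sketch (the `fresh_uuid` helper is illustrative):

```python
import multiprocessing
import os
import uuid


def fresh_uuid(_):
    # os.urandom draws fresh entropy from the OS in *this* process,
    # so forked workers cannot share RNG state and collide.
    return str(uuid.UUID(bytes=os.urandom(16), version=4))


if __name__ == "__main__":
    with multiprocessing.Pool(4) as pool:
        ids = pool.map(fresh_uuid, range(8))
    print(len(set(ids)))  # 8: every UUID distinct
```

On modern CPython, `uuid4()` itself uses `os.urandom`, so the problem should no longer reproduce; on older versions another common workaround was reseeding in each worker, e.g. calling `random.seed()` in a `Pool` initializer.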