Use multiprocessing shared memory without pickling a large object in Python

March 05, 2021, at 02:50 AM

I did some research but still haven't found an answer. Is it possible to create a shared-memory object for multiple processes without pickling the large object?

I'm trying to parallelize a task where the same very large object will be used by all processes (read-only). I tried using multiprocessing.Manager with a custom class (from this answer), but I still get the same error:

OverflowError: cannot serialize a bytes object larger than 4 GiB

I saw that lock=False turns the serialization step off, but if I want to use something like:

large_object = multiprocessing.Value("dtype", l_obj, lock=False)

I would need to define a custom data type, since multiprocessing.Value accepts only native (ctypes) types.

Is it possible to "turn off" serialization entirely? I just need to share one complex object across all processes, which will only read it.

Thank you.
