BSK_RL: Issue with Serialization of Custom Environment When Using RLlib APPO with Multiple Workers #615
When attempting to use custom environments based on the GeneralSatelliteTasking class with Ray/RLlib, I encounter a `TypeError: cannot pickle 'SwigPyObject' object` during the environment's initialization phase. The error occurs when I build an APPO algorithm trainer with a configuration that includes the GeneralSatelliteTasking environment (with both Gym and RLlib) and more than 0 workers.

I managed to train several agents with 0 workers (RLlib creates num_workers + 1 copies of the environment) and several environments per worker, but with more than 0 workers, `config.build()` calls `deepcopy` (which internally uses pickle) and raises the error. The GeneralSatelliteTasking environment relies on external libraries that use SWIG for their Python bindings, so I suspect the fix is to change the env_config to serializable objects, but I would really like to keep the current ones since they provide more flexibility and automated parameters.

Any help or references to documentation/examples that tackle similar challenges would be greatly appreciated. Thanks in advance!
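For context, a common workaround for this class of error is to keep the unpicklable (here, SWIG-backed) object out of the state that gets pickled: store a picklable factory function in the config instead of the live object, build the object lazily on first use, and drop it in `__getstate__` so `deepcopy`/pickle only copy the factory. The sketch below is illustrative and uses only the standard library; `EnvConfig` and `make_backend` are hypothetical names (not BSK_RL API), and `threading.Lock` stands in for the unpicklable `SwigPyObject`.

```python
import copy
import threading


class EnvConfig:
    """Hypothetical env_config wrapper: stores a picklable factory
    instead of the unpicklable (SWIG-backed) object itself."""

    def __init__(self, make_backend):
        self._make_backend = make_backend  # module-level function: picklable
        self._backend = None               # built lazily, never pickled

    @property
    def backend(self):
        # Construct the heavy/unpicklable object on first access,
        # i.e. inside each worker process after deserialization.
        if self._backend is None:
            self._backend = self._make_backend()
        return self._backend

    def __getstate__(self):
        # Drop the live backend so deepcopy/pickle only see the factory.
        state = self.__dict__.copy()
        state["_backend"] = None
        return state


def make_backend():
    # threading.Lock stands in for a SwigPyObject: it cannot be pickled.
    return threading.Lock()


cfg = EnvConfig(make_backend)
_ = cfg.backend            # instantiate the unpicklable object
clone = copy.deepcopy(cfg) # works: __getstate__ discards the backend
```

Each RLlib worker would then call `cfg.backend` locally and rebuild the SWIG objects itself, so nothing unpicklable ever crosses the process boundary.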
Replies: 1 comment 2 replies
Please post questions on BSK_RL on the BSK_RL repo forum.
Just enabled discussions over on the bsk_rl repo, I'll copy the question over there! I may have some solutions.
AVSLab/bsk_rl#123