Soft-Robot Control Environment (gym-softrobot)

The environment is designed to bring a wide range of reinforcement learning methods to soft-robotics control. Our inspiration comes from slender-body living creatures, such as octopuses and snakes. The code is based on PyElastica, an open-source physics simulator for highly deformable slender structures. We intend this package to be easy to install and fully compatible with OpenAI Gym and other available RL algorithms.

The package is under development, in the alpha phase. A detailed roadmap for Q2 2022 will be available.

Installation

pip install gym-softrobot

To test the installation, we provide a few debugging scripts. The environment can be tested using the following commands.

python -m gym_softrobot.debug.make     # Make environment and run 10 steps
python -m gym_softrobot.debug.registry # Print gym-softrobot environment
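
For a quick sanity check of the Gym interface itself, a minimal interaction loop is sketched below. The environment id "OctoFlat-v0" is used only as an example; run the registry script above to list the ids available in your installation.

import gym
import gym_softrobot  # importing the package registers its environments

# Example id only; check `python -m gym_softrobot.debug.registry` for the exact ids.
env = gym.make("OctoFlat-v0")
obs = env.reset()
for _ in range(10):
    action = env.action_space.sample()          # random control input
    obs, reward, done, info = env.step(action)  # OpenAI Gym 0.21 step signature
    if done:
        obs = env.reset()
env.close()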

Requirements:

  • Python 3.8+
  • OpenAI Gym 0.21.0
  • PyElastica 0.2+
  • COMM
  • Matplotlib (optional for display rendering and plotting)
  • POVray (optional for 3D rendering)

Rendering

We support two different backends for rendering: POVray and Matplotlib. The default is POVray, but the configuration can be switched by adding the following lines.

import gym_softrobot
from gym_softrobot.config import RendererType

gym_softrobot.RENDERER_CONFIG = RendererType.MATPLOTLIB  # Default: RendererType.POVRAY

POVray

To produce good-looking 3D videos and figures, we use the POVray Python wrapper Vapory. POVray is not required to run the environment, but it is necessary in order to use the env.render() function as in a typical gym environment.

If you would like to test POVray with gym-softrobot, use

python -m gym_softrobot.debug.render  # Render 10 frames using vapory

Matplotlib

We provide a secondary rendering tool using Matplotlib for quick debugging and sanity checks.
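
As a rough sketch, switching to the Matplotlib backend and rendering a frame could look like the following; the environment id is only an example, and the call assumes the standard Gym render() interface.

import gym
import gym_softrobot
from gym_softrobot.config import RendererType

# Select the Matplotlib backend before creating the environment.
gym_softrobot.RENDERER_CONFIG = RendererType.MATPLOTLIB

env = gym.make("OctoFlat-v0")  # example id; see the registry debug script
env.reset()
env.step(env.action_space.sample())
env.render()  # assumes the standard Gym render() interface
env.close()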

Reinforcement Learning Example

We have tested the environment with Stable Baselines3 for centralized control. More advanced algorithms are still under development.
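
As an illustration, training a centralized PPO policy with Stable Baselines3 might look like the sketch below; the environment id and hyperparameters are placeholders, not recommended settings.

import gym
import gym_softrobot  # registers the gym-softrobot environments
from stable_baselines3 import PPO

env = gym.make("OctoFlat-v0")  # example id; see the registry debug script
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)

# Quick rollout with the trained policy
obs = env.reset()
for _ in range(200):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()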

If you have your own algorithm that you would like to test with our environment, you are welcome to reach out to us.

Citation

Please use the following BibTeX entry to cite this work in your publications:

@misc{gym_softrobot,
  author = {Chia-Hsien Shih and Seung Hyun Kim and Mattia Gazzola},
  title = {Soft Robotics Environment for OpenAI Gym},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/skim0119/gym-softrobot}},
}

Environment Documentation

A description of each environment is available in the documentation.

Contribution

We are currently developing the package internally. We plan to open the package to external development in Q2 2022.

Author

The full list of contributors is available on the GitHub repository page.