to set it up. You'll also have to run `pip install -e '.[robotics]'` if
you didn't do the full install.
``` python
import gym
env = gym.make('HandManipulateBlock-v0')
env.reset()
env.render()
```
You can also find additional details in the accompanying [technical
report](https://arxiv.org/abs/1802.09464) and [blog
post](https://blog.openai.com/ingredients-for-robotics-research/). If
you use these environments, you can cite them as follows:
```
@misc{1802.09464,
  Author = {Matthias Plappert and Marcin Andrychowicz and Alex Ray and Bob McGrew and Bowen Baker and Glenn Powell and Jonas Schneider and Josh Tobin and Maciek Chociej and Peter Welinder and Vikash Kumar and Wojciech Zaremba},
  Title = {Multi-Goal Reinforcement Learning: Challenging Robotics Environments and Request for Research},
  Year = {2018},
  Eprint = {arXiv:1802.09464},
}
```
### Toy text
Toy environments which are text-based. There's no extra dependency to
install, so to get started, you can just do:
``` python
import gym
env = gym.make('FrozenLake-v0')
env.reset()
env.render()
```
## OpenAI Environments
### Roboschool
3D physics environments like the Mujoco environments, but using the Bullet physics engine, which does not require a commercial license.
Learn more here: https://github.com/openai/roboschool
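As a minimal sketch of how these environments are typically used (assuming Roboschool is installed; `RoboschoolAnt-v1` is an assumed example id, and exact names vary by release), importing the package registers its environments with Gym:
``` python
import gym
import roboschool  # importing roboschool registers its environments with Gym

# 'RoboschoolAnt-v1' is an assumed example id; names vary across Roboschool releases
env = gym.make('RoboschoolAnt-v1')
observation = env.reset()
for _ in range(100):
    observation, reward, done, info = env.step(env.action_space.sample())
    if done:
        observation = env.reset()
```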
### Gym-Retro
Gym Retro lets you turn classic video games into Gym environments for reinforcement learning and comes with integrations for ~1000 games. It uses various emulators that support the Libretro API, making it fairly easy to add new emulators.
Learn more here: https://github.com/openai/retro
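A minimal sketch, assuming Gym Retro is installed; `Airstriker-Genesis` is the freely redistributable game bundled with the package, while other games require importing their ROMs separately:
``` python
import retro

# Airstriker-Genesis ships with Gym Retro; other games need their ROMs imported first
env = retro.make(game='Airstriker-Genesis')
observation = env.reset()
for _ in range(100):
    observation, reward, done, info = env.step(env.action_space.sample())
    if done:
        observation = env.reset()
env.close()
```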
## Third Party Environments
Gym comes prepackaged with many environments. It's this common API around many environments that makes Gym so great. Here we list additional environments that do not come prepackaged with Gym. Submit another to this list via a pull request.
### Pybullet Robotics Environments
3D physics environments like the Mujoco environments, but using the Bullet physics engine, which does not require a commercial license. Works on Mac/Linux/Windows.
Learn more here: https://docs.google.com/document/d/10sXEhzFRSnvFcl3XxNGhnD4N2SedqwdAvK3dsihxVUA/edit#heading=h.wz5to0x8kqmr
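A minimal sketch, assuming the `pybullet` package is installed; importing `pybullet_envs` registers Bullet-based counterparts of the familiar locomotion tasks with Gym, and the `AntBulletEnv-v0` id used here is an assumed example that may differ across versions:
``` python
import gym
import pybullet_envs  # importing pybullet_envs registers the Bullet environments with Gym

# 'AntBulletEnv-v0' is an assumed example id; check the registry in your version
env = gym.make('AntBulletEnv-v0')
observation = env.reset()
observation, reward, done, info = env.step(env.action_space.sample())
```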
### PGE: Parallel Game Engine
PGE is a FOSS 3D engine for AI simulations that can interoperate with Gym. It contains environments with modern 3D graphics and uses Bullet for physics.
### osim-rl: Musculoskeletal Models in OpenSim
A human musculoskeletal model and a physics-based simulation environment in which you can synthesize physically and physiologically accurate motion. One of the environments built in this framework is the competition environment for a NIPS 2017 challenge.
Learn more here: https://github.com/stanfordnmbl/osim-rl
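A rough sketch of the interface; the `RunEnv` class and its arguments are assumptions based on the NIPS 2017 competition environment's documented usage and may differ in other releases of osim-rl (which also requires the OpenSim dependency to be installed):
``` python
# Sketch only: RunEnv and its arguments are assumptions based on the
# NIPS 2017 competition environment and may differ in other releases.
from osim.env import RunEnv

env = RunEnv(visualize=False)
observation = env.reset()
for _ in range(200):
    observation, reward, done, info = env.step(env.action_space.sample())
    if done:
        break
```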
### gym-miniworld: Minimalistic 3D Interior Environment Simulator
MiniWorld is a minimalistic 3D interior environment simulator for reinforcement learning and robotics research. It can be used to simulate environments with rooms, doors, hallways and various objects (e.g. office and home environments, mazes). MiniWorld can be seen as an alternative to VizDoom or DMLab. It is written 100% in Python and designed to be easily modified or extended.
Learn more here: https://github.com/maximecb/gym-miniworld
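A minimal sketch, assuming `gym-miniworld` is installed; importing the package registers its environments with Gym, and `MiniWorld-Hallway-v0` is used here as an assumed example id (see the repository for the full list):
``` python
import gym
import gym_miniworld  # importing gym_miniworld registers the MiniWorld environments with Gym

# 'MiniWorld-Hallway-v0' is an assumed example id; see the repository for the full list
env = gym.make('MiniWorld-Hallway-v0')
observation = env.reset()
observation, reward, done, info = env.step(env.action_space.sample())
```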
### gym-sokoban: 2D Transportation Puzzles
The environment consists of transportation puzzles in which the player's goal is to push all boxes onto the warehouse's storage locations.
The advantage of the environment is that it generates a new random level every time it is initialized or reset, which prevents overfitting to predefined levels.
Learn more here: https://github.com/mpSchrader/gym-sokoban
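A minimal sketch, assuming `gym-sokoban` is installed; `Sokoban-v0` is used here as an assumed example id, and a new random level is generated each time the environment is reset:
``` python
import gym
import gym_sokoban  # importing gym_sokoban registers the Sokoban environments with Gym

# 'Sokoban-v0' is an assumed example id; a new random level is generated on each reset
env = gym.make('Sokoban-v0')
observation = env.reset()
observation, reward, done, info = env.step(env.action_space.sample())
```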