PPO-LunarLander-v2 / lunar_agent / _stable_baselines3_version
First try at training PPO on LunarLander
e6dc5e9
1.7.0
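A minimal sketch of how an agent like this might be trained and saved with stable-baselines3 1.7.0; the hyperparameters, timestep budget, and the save name `lunar_agent` are assumptions for illustration, not details taken from this repository.

```python
# Sketch: train a PPO agent on LunarLander-v2 with stable-baselines3 (assumed version 1.7.0).
# Requires: pip install stable-baselines3==1.7.0 gym[box2d]
from stable_baselines3 import PPO

# Create the model; "MlpPolicy" and the default hyperparameters are assumptions.
model = PPO("MlpPolicy", "LunarLander-v2", verbose=1)

# Train for an assumed number of timesteps.
model.learn(total_timesteps=1_000_000)

# Save the agent; the archive this produces contains the
# _stable_baselines3_version file shown above (here "1.7.0").
model.save("lunar_agent")

# Reload the saved agent later for evaluation or inference.
model = PPO.load("lunar_agent")
```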