ppo-MicrortsMining-v1 / saved_models
PPO playing MicrortsMining-v1 from https://github.com/sgoodfriend/rl-algo-impls/tree/1d9c411313d64f925d31e3c0b9bc132fa170a3cc