ahGadji committed
Commit 481a1eb · verified · 1 Parent(s): b654745

My first DRFL

README.md CHANGED
@@ -1,13 +1,12 @@
  ---
+ library_name: stable-baselines3
  tags:
  - LunarLander-v2
- - ppo
  - deep-reinforcement-learning
  - reinforcement-learning
- - custom-implementation
- - deep-rl-course
+ - stable-baselines3
  model-index:
- - name: PPO
+ - name: ppo
  results:
  - task:
      type: reinforcement-learning
@@ -17,45 +16,22 @@ model-index:
      type: LunarLander-v2
    metrics:
    - type: mean_reward
- value: 202.45 +/- 64.47
+ value: 255.49 +/- 16.99
      name: mean_reward
    verified: false
  ---

- # PPO Agent Playing LunarLander-v2
+ # **ppo** Agent playing **LunarLander-v2**
+ This is a trained model of a **ppo** agent playing **LunarLander-v2**
+ using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).

- This is a trained model of a PPO agent playing LunarLander-v2.
+ ## Usage (with Stable-baselines3)
+ TODO: Add your code

- # Hyperparameters
- ```python
- {'exp_name': 'ppo'
- 'seed': 1
- 'torch_deterministic': True
- 'cuda': True
- 'track': False
- 'wandb_project_name': 'cleanRL'
- 'wandb_entity': None
- 'capture_video': False
- 'env_id': 'LunarLander-v2'
- 'total_timesteps': 100000
- 'learning_rate': 0.0025
- 'num_envs': 4
- 'num_steps': 128
- 'anneal_lr': True
- 'gae': True
- 'gamma': 0.99
- 'gae_lambda': 0.95
- 'num_minibatches': 8
- 'update_epochs': 4
- 'norm_adv': True
- 'clip_coef': 0.2
- 'clip_vloss': True
- 'ent_coef': 0.01
- 'vf_coef': 0.5
- 'max_grad_norm': 0.5
- 'target_kl': None
- 'repo_id': 'ahGadji/ppo-LunarLander-v2'
- 'batch_size': 512
- 'minibatch_size': 64}
- ```
-
+
+ ```python
+ from stable_baselines3 import ...
+ from huggingface_sb3 import load_from_hub
+
+ ...
+ ```
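The new card leaves the usage section as a TODO with a placeholder import. A minimal sketch of what that snippet could look like, assuming only what this commit shows (the repo id `ahGadji/ppo-LunarLander-v2`, the checkpoint file `ppo-LunarLander-v2.zip`, and a Gymnasium `LunarLander-v2` environment); the PPO import and the rollout loop are illustrative, not the author's code:

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Download the zipped SB3 checkpoint committed in this repo.
checkpoint = load_from_hub(
    repo_id="ahGadji/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",
)
model = PPO.load(checkpoint)

# Roll out one episode with the deterministic policy.
env = gym.make("LunarLander-v2")
obs, info = env.reset()
done = False
while not done:
    action, _states = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated
env.close()
```

`load_from_hub` only downloads the zipped checkpoint; `PPO.load` restores the policy from it.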
config.json CHANGED
@@ -1 +1 @@
- {"policy_class": {":type:": "<class 'abc.ABCMeta'>", ":serialized:": "gAWVOwAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMEUFjdG9yQ3JpdGljUG9saWN5lJOULg==", "__module__": "stable_baselines3.common.policies", "__doc__": "\n Policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Features extractor to use.\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ", "__init__": "<function ActorCriticPolicy.__init__ at 0x7e6929b0d240>", "_get_constructor_parameters": "<function ActorCriticPolicy._get_constructor_parameters at 0x7e6929b0d2d0>", "reset_noise": "<function ActorCriticPolicy.reset_noise at 0x7e6929b0d360>", "_build_mlp_extractor": "<function ActorCriticPolicy._build_mlp_extractor at 0x7e6929b0d3f0>", "_build": "<function ActorCriticPolicy._build at 0x7e6929b0d480>", "forward": "<function ActorCriticPolicy.forward at 0x7e6929b0d510>", "extract_features": "<function ActorCriticPolicy.extract_features at 0x7e6929b0d5a0>", "_get_action_dist_from_latent": "<function ActorCriticPolicy._get_action_dist_from_latent at 0x7e6929b0d630>", "_predict": "<function ActorCriticPolicy._predict at 0x7e6929b0d6c0>", "evaluate_actions": "<function ActorCriticPolicy.evaluate_actions at 0x7e6929b0d750>", "get_distribution": "<function ActorCriticPolicy.get_distribution at 0x7e6929b0d7e0>", "predict_values": "<function ActorCriticPolicy.predict_values at 0x7e6929b0d870>", "__abstractmethods__": "frozenset()", "_abc_impl": "<_abc._abc_data object at 0x7e6929aafc80>"}, "verbose": 1, "policy_kwargs": {}, "num_timesteps": 1015808, "_total_timesteps": 1000000, "_num_timesteps_at_start": 0, "seed": null, "action_noise": null, "start_time": 1712863917986603256, "learning_rate": 0.0003, "tensorboard_log": null, "_last_obs": {":type:": "<class 'numpy.ndarray'>", ":serialized:": 
"gAWVdQIAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYAAgAAAAAAAOaPCz5FMaA+zsvjvXMGdb4SnhA9OhpgvQAAAAAAAAAATfYbPR9l7LklwvA6Is4WNi22gDtcLg+6AACAPwAAgD/Tli8+9J6SP6O7Fj9SZ6i+eELcPC3PHz4AAAAAAAAAAOZfGT0UDJi6GketOrPJHDZ1I9w6nhTFuQAAgD8AAIA/YEMfPuuYOz8hQke9fv6QvuBPuTx5n8c8AAAAAAAAAAAAFlK99kQZul6/Xbu8pya2mxHDOr1GfzoAAIA/AACAPwCziL1c/2y66HDZuWXQXbboyyk7jUnQNQAAgD8AAIA/mt66PfdfIj9VTeO9WVRjvvCFhLvahW49AAAAAAAAAACa9AA94YizuhiGRDnxzyw0vRLUOP56YLgAAIA/AACAP82SLDyOHpg9+2uTvbd2bL4k4qC9KNhsPQAAAAAAAAAAMxaVvPacFbq+5+Y6b7XXNRIdVrvpowa6AACAPwAAgD/zzUG+Kb28PxpV5L4/iBi+5mWDvkpL870AAAAAAAAAAADGZrwbo+s9HUo/vczlVb4j4K+8w9h6PQAAAAAAAAAAmpi2PEgnqLq1+tQ6GpbCNYcsUjoLpvO5AACAPwAAgD9NOhG9SL+IuqoKLTqO/4g1ajiUOsLJRrkAAIA/AACAP810ADyPji26eDDXuw6YxLLxlAu4kphxsQAAgD8AAIA/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwiGlIwBQ5R0lFKULg=="}, "_last_episode_starts": {":type:": "<class 'numpy.ndarray'>", ":serialized:": "gAWVgwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSxCFlIwBQ5R0lFKULg=="}, "_last_original_obs": null, "_episode_num": 0, "use_sde": false, "sde_sample_freq": -1, "_current_progress_remaining": -0.015808000000000044, "_stats_window_size": 100, "ep_info_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVQgwAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHQGa4S13MY/GMAWyUTegDjAF0lEdAkrKPIjnmrHV9lChoBkdAYcrsAvL5h2gHTegDaAhHQJLDj3i704B1fZQoaAZHQGOcN9ph4MZoB03oA2gIR0CS06hG6PKddX2UKGgGR0BhDGv2Xb/PaAdN6ANoCEdAktZG38XN1XV9lChoBkdAZFdKLbYbsGgHTegDaAhHQJLWcHQhOgx1fZQoaAZHQGGJeyRjjJdoB03oA2gIR0CS2VA6uGKydX2UKGgGR0BnSoIldC3PaAdN6ANoCEdAkttqoqCpWHV9lChoBkdAZLxlgc94eWgHTegDaAhHQJLh5zT4L1F1fZQoaAZHQGM9lr/KhctoB03oA2gIR0CS4s5GBnSOdX2UKGgGR0BjlPqX4TK1aAdN6ANoCEdAkuZ5Dqnm73V9lChoBkdAX3uQMhHLBGgHTegDaAhHQJLmnboKUml1fZQoaAZHQFunCo0hvBJoB03oA2gIR0CS6SbqyGBXdX2UKGgGR0BdyGMn7YTTaAdN6ANoCEdAkwBNDUmUn3V9lChoBkdAYJXFpfx+a2gHTegDaAhHQJMDC3w1BMV1fZQoaAZHQGVDOuJUHY9oB03oA2gIR0CTA4/rB0p3dX2UKGgGR0Bih0qSX+l1aAdN6ANoCEdAkwtHjp9qlHV9lChoBkdAXhTgl4TsY2gHTegDaAhHQJMMDLB9Cu51fZQoaAZHQGOHOUD+zdFoB03oA2gIR0CTGirzXjEOdX2UKGgGR0BelPUz9CNTaAdN6ANoCEdAkytjYNAkcHV9lChoBkdAYwFrTH80lGgHTegDaAhHQJMt7JPqLTB1fZQoaAZHQGM6A31jAi5oB03oA2gIR0CTLhBwMpgDdX2UKGgGR0BbjuhkAggYaAdN6ANoCEdAkzDKjFhod3V9lChoBkdAZX4VpKzzE2gHTegDaAhHQJMysMRYigV1fZQoaAZHQFniMUh3aBZoB03oA2gIR0CTOKgGr0aqdX2UKGgGR0BjFSEHt4RmaAdN6ANoCEdAkzmBRIjGDXV9lChoBkdAYhlHlOoHcGgHTegDaAhHQJM9De/Ho5h1fZQoaAZHQGCFACnxaxJoB03oA2gIR0CTPTt78ejmdX2UKGgGR0BhuNupCKJmaAdN6ANoCEdAkz+l7dBSk3V9lChoBkdAOeyv5gw482gHTT0BaAhHQJNA3nW8RL91fZQoaAZHQEKGWykbgj1oB01kAWgIR0CTVqER8MNMdX2UKGgGR0Bcfo1gpjMFaAdN6ANoCEdAk1b2cBltj3V9lChoBkdAY2PSYw7DEWgHTegDaAhHQJNZmwnpjc51fZQoaAZHQGRyHCwbEP1oB03oA2gIR0CTWjAM2FWXdX2UKGgGR0BEBWK/EfknaAdL9WgIR0CTYTNx2jfvdX2UKGgGR0Bh7f9pAUtaaAdN6ANoCEdAk2I/C/GlynV9lChoBkdAZZAwhW5pamgHTegDaAhHQJNi93dKujh1fZQoaAZHQF1Lj8UEgW9oB03oA2gIR0CTcBxtpEhJdX2UKGgGR0Bi4pYHPeHjaAdN6ANoCEdAk35Ke9SMtXV9lChoBkdAYqLCl7+kxmgHTegDaAhHQJOF2+evpyJ1fZQoaAZHQF4FKE384xVoB03oA2gIR0CTiCTUy57PdX2UKGgGR0BkM81fmcOLaAdN6ANoCEdAk45vZdv863V9lChoBkdAYYBKPGQ0XWgHTegDaAhHQJOPVwVCXyB1fZQoaAZHQF89ffoA4n5oB03oA2gIR0CTkt4VRDTjdX2UKGgGR0BfXvjXFtKqaAdN6ANoCEdAk5MBpg1FY3V9lChoBkdAZuIyJsO5KGgHTegDaAhHQJOWS1og3cZ1fZQoaAZHQGWpOZCv5gxoB03oA2gIR0CTmFT0xubadX2UKGgGR0BamhjOLR8daAdN6ANoCEdAk5iivcJtznV9lChoBkdAYtRzS1E3KmgHTegDaAhHQJOsX0se4kN1fZQoaAZHQGRWDjrAxi5oB03oA2gIR0CTrNhrFfiQdX2UKGgGR0Beh6BI4EOiaAdN6ANoCEdAk7UKInBtUHV9lChoBkdAYMBFPznRs2gHTegDaAhHQJO2RTZQHiZ1f
ZQoaAZHQGLAOBUaQ3hoB03oA2gIR0CTtwt6HCXQdX2UKGgGR0AgA/tY0VJuaAdNLwFoCEdAk7wMXenAI3V9lChoBkdAYqI+36Q/5mgHTegDaAhHQJPDJtpEhJR1fZQoaAZHQFioq8lHBk9oB03oA2gIR0CT0ZuuzQeFdX2UKGgGR0BngEstkFwDaAdN6ANoCEdAk9cyIcinpHV9lChoBkdAY9s4SYgJTmgHTegDaAhHQJPZdeAuqWF1fZQoaAZHQGAu2wFC9h9oB03oA2gIR0CT4TdnTRYzdX2UKGgGR0BklCih37k5aAdN6ANoCEdAk+KHyd4FA3V9lChoBkdAY6Kgow22omgHTegDaAhHQJPnyd5IH1R1fZQoaAZHQGgOM9KVY6poB03oA2gIR0CT5+3mV7hOdX2UKGgGR0Be4Jgb6xgRaAdN6ANoCEdAk+uIs7MgU3V9lChoBkdAYeocCHRCyGgHTegDaAhHQJPuBRyfcvd1fZQoaAZHQGAgnX/YJ3RoB03oA2gIR0CUAjmUnogWdX2UKGgGR0BiYeZJCjUNaAdN6ANoCEdAlAK+3H7xeHV9lChoBkdAY5l/oaDPGGgHTegDaAhHQJQJIMEzO5d1fZQoaAZHQGbRrbYbsGBoB03oA2gIR0CUCgg7o0Q9dX2UKGgGR0Bktj48EFGHaAdN6ANoCEdAlAquq//Nq3V9lChoBkdAW5Hwvxpco2gHTegDaAhHQJQP6G47Rv51fZQoaAZHQGKG08vEjxFoB03oA2gIR0CUGNqtozvadX2UKGgGR0BhW8tGus90aAdN6ANoCEdAlCZ6DGtITXV9lChoBkdAXRSiEg4ffWgHTegDaAhHQJQsTMKTjed1fZQoaAZHQD5w4iosI3RoB00mAWgIR0CULfp5eJHidX2UKGgGR0Bl8PTmW+oMaAdN6ANoCEdAlC5ysS00FnV9lChoBkdAXqezhP0qY2gHTegDaAhHQJQ0jv0AcT91fZQoaAZHQGa5al+EytVoB03oA2gIR0CUNWPk7wKCdX2UKGgGR0BllVqveP7vaAdN6ANoCEdAlDipgXuVo3V9lChoBkdAZXG9GI9C/2gHTegDaAhHQJQ4yI68xsV1fZQoaAZHQGXUTHS4OMFoB03oA2gIR0CUPBkFfReDdX2UKGgGR0Bbar9l2/zraAdN6ANoCEdAlD55gkTpPnV9lChoBkdAYjd8xbjcVWgHTegDaAhHQJRBee+VTrF1fZQoaAZHQGBrM5GSZBtoB03oA2gIR0CUQiLJjlPrdX2UKGgGR0BhU9DBuXNUaAdN6ANoCEdAlFsgmNR3vHV9lChoBkdAYUJl+3H7xmgHTegDaAhHQJRcDFxXGOx1fZQoaAZHQGPDK2a2F39oB03oA2gIR0CUXLb212JSdX2UKGgGR0BK5khA4XGfaAdNFgFoCEdAlGD+sYEW7HV9lChoBkdAZPSzN2TxG2gHTegDaAhHQJRiItthuwZ1fZQoaAZHQCImXmeUY9BoB00tAWgIR0CUbpKJ2t+1dX2UKGgGR0BI/1Fpfx+baAdL7GgIR0CUb/qQzUI+dX2UKGgGR0BhDp1q33HraAdN6ANoCEdAlHgCaNMoMXV9lChoBkdAYFwbhFVktmgHTegDaAhHQJR9CKMvRJF1fZQoaAZHQGH/HqeK8+RoB03oA2gIR0CUfpJ4B3iadX2UKGgGR0BdUZqh11W9aAdN6ANoCEdAlH8CDRMN+nV9lChoBkdAYNL9tMwlB2gHTegDaAhHQJSEvh99c8l1fZQoaAZHQGOGXta6jFhoB03oA2gIR0CUhYdwvQF+dX2UKGgGR0Bg1Vq33HrAaAdN6ANoCEdAlIkI33pOe3V9lChoBkdAYqhOlfqoqGgHTegDaAhHQJSJKa2F36h1fZQoaAZHQGDvyoXKr7xoB03oA2gIR0CUjMcNYr8SdX2UKGgGR0BihLujRD1HaAdN6ANoCEdAlJKZv5xionV9lChoBkdAZwFVXFLnLmgHTegDaAhHQJSTMxREWqN1ZS4="}, "ep_success_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVIAAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKULg=="}, "_n_updates": 248, "observation_space": {":type:": "<class 'gymnasium.spaces.box.Box'>", ":serialized:": "gAWVdgIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWCAAAAAAAAAABAQEBAQEBAZRoCIwCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksIhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoESiWCAAAAAAAAAABAQEBAQEBAZRoFUsIhZRoGXSUUpSMBl9zaGFwZZRLCIWUjANsb3eUaBEoliAAAAAAAAAAAAC0wgAAtMIAAKDAAACgwNsPScAAAKDAAAAAgAAAAICUaAtLCIWUaBl0lFKUjARoaWdolGgRKJYgAAAAAAAAAAAAtEIAALRCAACgQAAAoEDbD0lAAACgQAAAgD8AAIA/lGgLSwiFlGgZdJRSlIwIbG93X3JlcHKUjFtbLTkwLiAgICAgICAgLTkwLiAgICAgICAgIC01LiAgICAgICAgIC01LiAgICAgICAgIC0zLjE0MTU5MjcgIC01LgogIC0wLiAgICAgICAgIC0wLiAgICAgICBdlIwJaGlnaF9yZXBylIxTWzkwLiAgICAgICAgOTAuICAgICAgICAgNS4gICAgICAgICA1LiAgICAgICAgIDMuMTQxNTkyNyAgNS4KICAxLiAgICAgICAgIDEuICAgICAgIF2UjApfbnBfcmFuZG9tlE51Yi4=", "dtype": "float32", "bounded_below": "[ True True True True True True True True]", "bounded_above": "[ True True True True True True True True]", "_shape": [8], "low": "[-90. -90. -5. -5. -3.1415927 -5.\n -0. -0. ]", "high": "[90. 90. 5. 5. 3.1415927 5.\n 1. 1. ]", "low_repr": "[-90. -90. -5. -5. -3.1415927 -5.\n -0. -0. ]", "high_repr": "[90. 90. 5. 5. 3.1415927 5.\n 1. 1. 
]", "_np_random": null}, "action_space": {":type:": "<class 'gymnasium.spaces.discrete.Discrete'>", ":serialized:": "gAWV2wAAAAAAAACMGWd5bW5hc2l1bS5zcGFjZXMuZGlzY3JldGWUjAhEaXNjcmV0ZZSTlCmBlH2UKIwBbpSMFW51bXB5LmNvcmUubXVsdGlhcnJheZSMBnNjYWxhcpSTlIwFbnVtcHmUjAVkdHlwZZSTlIwCaTiUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYkMIBAAAAAAAAACUhpRSlIwFc3RhcnSUaAhoDkMIAAAAAAAAAACUhpRSlIwGX3NoYXBllCmMBWR0eXBllGgOjApfbnBfcmFuZG9tlE51Yi4=", "n": "4", "start": "0", "_shape": [], "dtype": "int64", "_np_random": null}, "n_envs": 16, "n_steps": 1024, "gamma": 0.999, "gae_lambda": 0.98, "ent_coef": 0.01, "vf_coef": 0.5, "max_grad_norm": 0.5, "batch_size": 64, "n_epochs": 4, "clip_range": {":type:": "<class 'function'>", ":serialized:": "gAWVxQIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwSIAFMAlE6FlCmMAV+UhZSMSS91c3IvbG9jYWwvbGliL3B5dGhvbjMuMTAvZGlzdC1wYWNrYWdlcy9zdGFibGVfYmFzZWxpbmVzMy9jb21tb24vdXRpbHMucHmUjARmdW5jlEuEQwIEAZSMA3ZhbJSFlCl0lFKUfZQojAtfX3BhY2thZ2VfX5SMGHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbpSMCF9fbmFtZV9flIwec3RhYmxlX2Jhc2VsaW5lczMuY29tbW9uLnV0aWxzlIwIX19maWxlX1+UjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaB99lH2UKGgWaA2MDF9fcXVhbG5hbWVfX5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgXjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOURz/JmZmZmZmahZRSlIWUjBdfY2xvdWRwaWNrbGVfc3VibW9kdWxlc5RdlIwLX19nbG9iYWxzX1+UfZR1hpSGUjAu"}, "clip_range_vf": null, "normalize_advantage": true, "target_kl": null, "lr_schedule": {":type:": "<class 'function'>", ":serialized:": "gAWVxQIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwSIAFMAlE6FlCmMAV+UhZSMSS91c3IvbG9jYWwvbGliL3B5dGhvbjMuMTAvZGlzdC1wYWNrYWdlcy9zdGFibGVfYmFzZWxpbmVzMy9jb21tb24vdXRpbHMucHmUjARmdW5jlEuEQwIEAZSMA3ZhbJSFlCl0lFKUfZQojAtfX3BhY2thZ2VfX5SMGHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbpSMCF9fbmFtZV9flIwec3RhYmxlX2Jhc2VsaW5lczMuY29tbW9uLnV0aWxzlIwIX19maWxlX1+UjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaB99lH2UKGgWaA2MDF9fcXVhbG5hbWVfX5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgXjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOURz8zqSowVTJhhZRSlIWUjBdfY2xvdWRwaWNrbGVfc3VibW9kdWxlc5RdlIwLX19nbG9iYWxzX1+UfZR1hpSGUjAu"}, "system_info": {"OS": "Linux-6.1.58+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sat Nov 18 15:31:17 UTC 2023", "Python": "3.10.12", "Stable-Baselines3": "2.0.0a5", "PyTorch": "2.2.1+cu121", "GPU Enabled": "True", "Numpy": "1.25.2", "Cloudpickle": "2.2.1", "Gymnasium": "0.28.1", "OpenAI Gym": "0.25.2"}}
 
+ {"policy_class": {":type:": "<class 'abc.ABCMeta'>", ":serialized:": "gAWVOwAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMEUFjdG9yQ3JpdGljUG9saWN5lJOULg==", "__module__": "stable_baselines3.common.policies", "__doc__": "\n Policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Features extractor to use.\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ", "__init__": "<function ActorCriticPolicy.__init__ at 0x7e4adade85e0>", "_get_constructor_parameters": "<function ActorCriticPolicy._get_constructor_parameters at 0x7e4adade8670>", "reset_noise": "<function ActorCriticPolicy.reset_noise at 0x7e4adade8700>", "_build_mlp_extractor": "<function ActorCriticPolicy._build_mlp_extractor at 0x7e4adade8790>", "_build": "<function ActorCriticPolicy._build at 0x7e4adade8820>", "forward": "<function ActorCriticPolicy.forward at 0x7e4adade88b0>", "extract_features": "<function ActorCriticPolicy.extract_features at 0x7e4adade8940>", "_get_action_dist_from_latent": "<function ActorCriticPolicy._get_action_dist_from_latent at 0x7e4adade89d0>", "_predict": "<function ActorCriticPolicy._predict at 0x7e4adade8a60>", "evaluate_actions": "<function ActorCriticPolicy.evaluate_actions at 0x7e4adade8af0>", "get_distribution": "<function ActorCriticPolicy.get_distribution at 0x7e4adade8b80>", "predict_values": "<function ActorCriticPolicy.predict_values at 0x7e4adade8c10>", "__abstractmethods__": "frozenset()", "_abc_impl": "<_abc._abc_data object at 0x7e4adadfc7c0>"}, "verbose": 1, "policy_kwargs": {}, "num_timesteps": 1015808, "_total_timesteps": 1000000, "_num_timesteps_at_start": 0, "seed": null, "action_noise": null, "start_time": 1717796620904468447, "learning_rate": 0.0003, "tensorboard_log": null, "_last_obs": {":type:": "<class 'numpy.ndarray'>", ":serialized:": 
"gAWVdQIAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYAAgAAAAAAAJOQWj43xD8/FXJaPQAfuL67Sgs+kywpvQAAAAAAAAAAZlLGPBSMsbq+xc68fPKjPCcXqLqAro09AACAPwAAgD9meQ09MXGtP3tpfD2A2qy+k6OTunfupTwAAAAAAAAAAK3EHD40WmY/+6uePQcY2L6Pg6k9AB1xvQAAAAAAAAAAzVRWPhVw8T5uF2O+bw2OvjwPGbyC74a8AAAAAAAAAACzb7c9QylKvMqFh72a5y88v3ivPff5E70AAAAAAAAAAJrxtTs/a2w/ujvHPRkYlb7gXN28GVQKPgAAAAAAAAAAAFTOvVsTsz+pcbG+FPO6vu3t670DtDG+AAAAAAAAAAAzJbu8rpWfulYMbrZE5W6xHN6qOc5ijjUAAIA/AACAPy1UDL7qyh8/4ffFPcPnb77bpRC9IpyYPQAAAAAAAAAAVkpevpxyHz9xFQQ+U+p8vgRHbr0miQY9AAAAAAAAAAAAVnS8hdT6u11B5rsBgaw8sYBHve5Zjz0AAIA/AACAPzqPNz5ftmA/IuEYPVX+tb4YMes9KHaruQAAAAAAAAAAYjOUvszxYz+DSFA8DaaFvvlhQr53FhO9AAAAAAAAAADm7lC9M32vPp/BFz5nbWy+h/IJOy/egzwAAAAAAAAAAGa6Or2IM4+8m5nDvFLrZrzZK3i8dtOVuwAAgD8AAIA/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwiGlIwBQ5R0lFKULg=="}, "_last_episode_starts": {":type:": "<class 'numpy.ndarray'>", ":serialized:": "gAWVgwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSxCFlIwBQ5R0lFKULg=="}, "_last_original_obs": null, "_episode_num": 0, "use_sde": false, "sde_sample_freq": -1, "_current_progress_remaining": -0.015808000000000044, "_stats_window_size": 100, "ep_info_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVNgwAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHQHAnGxQizLSMAWyUTRoBjAF0lEdAlL/LZ8KG+XV9lChoBkdAcRRxVhkRSWgHTRoBaAhHQJTABW2gFot1fZQoaAZHQHLbpWV/tppoB01MAWgIR0CUwCkEcKgJdX2UKGgGR0BNqARbr1M/aAdL3mgIR0CUwHXC0ngHdX2UKGgGR0BvsLpiZv1laAdNQAFoCEdAlMHi5/b0v3V9lChoBkdAULXTkQwsXmgHS+9oCEdAlMHrfUF0P3V9lChoBkdAcLdHEMspX2gHTUcBaAhHQJTCPalDWsl1fZQoaAZHQHAns/QjUutoB00WAWgIR0CUwrbfP5YYdX2UKGgGR0BwrwF3Y+SsaAdNDAFoCEdAlMNlQ66renV9lChoBkdAcWOSKFZgX2gHTRwBaAhHQJTDzVRUFSt1fZQoaAZHQHHraBun/DNoB02hAWgIR0CUw93BpHqedX2UKGgGR0BCYZQgs9SuaAdL4mgIR0CUxWVlwtJ4dX2UKGgGR0BDMYPGyX2NaAdL4WgIR0CUxoiB5HEudX2UKGgGR0BFNaS1Vo6CaAdL7GgIR0CUxriw0O3EdX2UKGgGR0BwAUXWOIZZaAdNQgFoCEdAlMbBddE9dXV9lChoBkdAcF9t4A0bcWgHTQwBaAhHQJTHTfvWpZR1fZQoaAZHQHGXMS5AhStoB00sAWgIR0CUx1FYuCf6dX2UKGgGR0BxfC2lVLi/aAdNOwFoCEdAlMeoh6jWTXV9lChoBkdAb/DEG7jDK2gHTSABaAhHQJTImvIOpbV1fZQoaAZHQG7WWRigCfZoB01WAWgIR0CUyYGpuMuOdX2UKGgGR0Bw/KANG3F2aAdNIAFoCEdAlMoerhisn3V9lChoBkdAcU133Hq/umgHTQABaAhHQJTK1qCYkVx1fZQoaAZHQG3RaVMVUMpoB003AWgIR0CUyytelbeNdX2UKGgGR0BwmKZmZmZmaAdNOQFoCEdAlMu8495hSnV9lChoBkdAbTvlRP420mgHTTIBaAhHQJTM4b83uNR1fZQoaAZHQGz3XPqs2ehoB02TAWgIR0CUzZpwS8J2dX2UKGgGR0ByO2MKkVN6aAdNAgFoCEdAlM5pqZc9n3V9lChoBkdAcM3uQ6p5vGgHTS0BaAhHQJTOb/DLr5Z1fZQoaAZHQHFrOzlcQiBoB00eAWgIR0CUz0NVR1oydX2UKGgGR0BE3OMuOCGvaAdL5WgIR0CUz5lHjIaMdX2UKGgGR0BwDyPGQ0XQaAdNNwFoCEdAlM/fTG5tnHV9lChoBkdAclJVlf7aZmgHTTIBaAhHQJTQ+m2sq8V1fZQoaAZHQHDaRkd3jdZoB01MAWgIR0CU0V75mAbydX2UKGgGR0ByP3cxj8UFaAdNWgFoCEdAlNHB+fAbhnV9lChoBkdAcWJR9w3o92gHTSYBaAhHQJTSf8vVVgh1fZQoaAZHQHMEGW6bvw5oB00cAWgIR0CU0s9K28ZldX2UKGgGR0ArY7oSteUqaAdL9GgIR0CU0zvAoG6gdX2UKGgGR0ByP84YJmdzaAdNLwFoCEdAlNQLmuDBdnV9lChoBkdAUCzMotthu2gHS/loCEdAlNU8JIDoyXV9lChoBkdAcYse/5+H8GgHS/xoCEdAlNY8l9jPOnV9lChoBkdAbXdrdFfAsWgHTTUBaAhHQJTWjD1oQFt1fZQoaAZHQG6lLSmZVn5oB00fAWgIR0CU13jdHlOodX2UKGgGR0Bw8CSq2jO+aAdNDAFoCEdAlNgcAR02cnV9lChoBkdAcSg+MqBmPGgHTa0BaAhHQJTYy57PY4B1fZQoaAZHQEt0Z3s5XEJoB0vgaAhHQJTZBVYISlF1fZQoaAZHQHHyjibUgB9oB01LAWgIR0CU7tz3h4t6dX2UKGgGR0Bxrpy0a6z3aAdNMAFoCEdAlO++Sr5qM3V9lChoBkdAcIq0Yj0L+mgHTSoBaAhHQJTv9kxyn1p1fZQoaAZHQHBxClSCOFRoB00qAWgIR0CU8UMb3oLYdX2UKGgGR0Bv+oqiGnGbaAdNjAFoCEdAlPGbT6SDAnV9lChoBkdANHQnc+JP7GgHS61oCEdAlPIxA0Kqn3V9lChoBkdAUi19T
gl4T2gHS81oCEdAlPJYe1a4c3V9lChoBkdAb0z91EE1VGgHTUgBaAhHQJTyjHxSYPZ1fZQoaAZHQG3/cDB/I81oB00wAWgIR0CU8xLSeAd5dX2UKGgGR0BxDDNr0rbyaAdNXQFoCEdAlPOHfQ8fWHV9lChoBkdAcrEtALRa5mgHTSgBaAhHQJTzzjin5zp1fZQoaAZHQG6IcO09hZ1oB003AWgIR0CU9PUXHim3dX2UKGgGR0BwyWBreqJeaAdNCAFoCEdAlPWnDJlrdnV9lChoBkdAb17FiKBNEmgHTTkBaAhHQJT2idYnv2J1fZQoaAZHQGAsC8e0XxhoB03oA2gIR0CU90MaCL/CdX2UKGgGR0BxcP8+A3DOaAdNRwFoCEdAlPe3rhR64XV9lChoBkdAcG3AG0NSZWgHTSwBaAhHQJT3tV4oqkN1fZQoaAZHQG/O5EUj9n9oB00tAWgIR0CU+L71ZkkKdX2UKGgGR0ByTzobGWD6aAdNTQFoCEdAlPmHUYsND3V9lChoBkdAcf9QpWmxdWgHTRkBaAhHQJT5r6Fdszl1fZQoaAZHQHC4FMmF8G9oB00qAWgIR0CU+d/o7muDdX2UKGgGR0Br9Gjj7yhBaAdNMAFoCEdAlPtXcL0BfnV9lChoBkdAckqFdcB2fWgHTSsBaAhHQJT7xb1RLsd1fZQoaAZHQHJmCPEKmbdoB01PAWgIR0CU/BVjI7vHdX2UKGgGR0BtG7T6SDAaaAdNMAFoCEdAlPxv16E8JXV9lChoBkdAcWidpItlI2gHTXUBaAhHQJT9Cqgh8pl1fZQoaAZHQHEeA3kxREZoB009AWgIR0CU/SADq4YrdX2UKGgGR0BxsmowVTJhaAdNTAFoCEdAlP7oWUKRdXV9lChoBkdAcio850bLlmgHTTIBaAhHQJT+5hfBvaV1fZQoaAZHQG3rMkpqh11oB00nAWgIR0CVAEeXAuZkdX2UKGgGR0BxZbHwPRReaAdNQQFoCEdAlQBTO9nK4nV9lChoBkdAbaJf2K2rn2gHTS4BaAhHQJUA8JMQEp11fZQoaAZHQHKMvY4ACGNoB00SAWgIR0CVASLNwBHTdX2UKGgGR0Bu0fMbFS88aAdNTQFoCEdAlQHaIacZtXV9lChoBkdAcgMXq7iAD2gHTQ8BaAhHQJUCH7iyY5V1fZQoaAZHQG1NZEUj9n9oB00cAWgIR0CVAlB5X2dvdX2UKGgGR0BvGEz0pVjqaAdNJQFoCEdAlQJyaVlf7nV9lChoBkdARO+ktVaOgmgHS+1oCEdAlQMdVR1ox3V9lChoBkdAb3Kv7FbV0GgHTSMBaAhHQJUD7gTAWSF1fZQoaAZHQHCgocJdB0JoB006AWgIR0CVBPrwOOKgdX2UKGgGR0By1B7SiM5waAdNKAFoCEdAlQUY593KS3V9lChoBkdAb5U0pEx7A2gHTScBaAhHQJUFqGHpKSR1fZQoaAZHQG9XXiaRZEFoB00wAWgIR0CVBfrTH80ldX2UKGgGR0BtuReb/ffoaAdNFwFoCEdAlQbwUg0TDnV9lChoBkdAcAD/m1YyPGgHTSsBaAhHQJUHg/FBIFx1fZQoaAZHQHB6mT9sJppoB001AWgIR0CVCWGxlg+hdX2UKGgGR0Bt/r/wRXfZaAdNJgFoCEdAlQmRMvh60XV9lChoBkdAcXGWGh24eGgHTUEBaAhHQJUJwjhUBGR1fZQoaAZHQHCZznaFmFtoB00XAWgIR0CVCnJPZZjhdX2UKGgGR0A6GxcVxjriaAdL4GgIR0CVCuAbQ1JldX2UKGgGR0BuVypNsWO7aAdNIQFoCEdAlQsn003wTnV9lChoBkdAcEbegte2NWgHTTwBaAhHQJULZLZi/fx1fZQoaAZHQG+vBjOLR8doB01bAWgIR0CVC5aRp1zRdX2UKGgGR0BwTzy+Yc//aAdNSAFoCEdAlQwxqj8DS3V9lChoBkdAbsfJ04iosWgHTTcBaAhHQJUMmB9Tgl51ZS4="}, "ep_success_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVIAAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKULg=="}, "_n_updates": 248, "observation_space": {":type:": "<class 'gymnasium.spaces.box.Box'>", ":serialized:": "gAWVdgIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWCAAAAAAAAAABAQEBAQEBAZRoCIwCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksIhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoESiWCAAAAAAAAAABAQEBAQEBAZRoFUsIhZRoGXSUUpSMBl9zaGFwZZRLCIWUjANsb3eUaBEoliAAAAAAAAAAAAC0wgAAtMIAAKDAAACgwNsPScAAAKDAAAAAgAAAAICUaAtLCIWUaBl0lFKUjARoaWdolGgRKJYgAAAAAAAAAAAAtEIAALRCAACgQAAAoEDbD0lAAACgQAAAgD8AAIA/lGgLSwiFlGgZdJRSlIwIbG93X3JlcHKUjFtbLTkwLiAgICAgICAgLTkwLiAgICAgICAgIC01LiAgICAgICAgIC01LiAgICAgICAgIC0zLjE0MTU5MjcgIC01LgogIC0wLiAgICAgICAgIC0wLiAgICAgICBdlIwJaGlnaF9yZXBylIxTWzkwLiAgICAgICAgOTAuICAgICAgICAgNS4gICAgICAgICA1LiAgICAgICAgIDMuMTQxNTkyNyAgNS4KICAxLiAgICAgICAgIDEuICAgICAgIF2UjApfbnBfcmFuZG9tlE51Yi4=", "dtype": "float32", "bounded_below": "[ True True True True True True True True]", "bounded_above": "[ True True True True True True True True]", "_shape": [8], "low": "[-90. -90. -5. -5. -3.1415927 -5.\n -0. -0. ]", "high": "[90. 90. 5. 5. 3.1415927 5.\n 1. 1. ]", "low_repr": "[-90. -90. -5. -5. -3.1415927 -5.\n -0. -0. ]", "high_repr": "[90. 90. 5. 5. 3.1415927 5.\n 1. 1. 
]", "_np_random": null}, "action_space": {":type:": "<class 'gymnasium.spaces.discrete.Discrete'>", ":serialized:": "gAWV2wAAAAAAAACMGWd5bW5hc2l1bS5zcGFjZXMuZGlzY3JldGWUjAhEaXNjcmV0ZZSTlCmBlH2UKIwBbpSMFW51bXB5LmNvcmUubXVsdGlhcnJheZSMBnNjYWxhcpSTlIwFbnVtcHmUjAVkdHlwZZSTlIwCaTiUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYkMIBAAAAAAAAACUhpRSlIwFc3RhcnSUaAhoDkMIAAAAAAAAAACUhpRSlIwGX3NoYXBllCmMBWR0eXBllGgOjApfbnBfcmFuZG9tlE51Yi4=", "n": "4", "start": "0", "_shape": [], "dtype": "int64", "_np_random": null}, "n_envs": 16, "n_steps": 1024, "gamma": 0.999, "gae_lambda": 0.98, "ent_coef": 0.01, "vf_coef": 0.5, "max_grad_norm": 0.5, "batch_size": 64, "n_epochs": 4, "clip_range": {":type:": "<class 'function'>", ":serialized:": "gAWVxQIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwSIAFMAlE6FlCmMAV+UhZSMSS91c3IvbG9jYWwvbGliL3B5dGhvbjMuMTAvZGlzdC1wYWNrYWdlcy9zdGFibGVfYmFzZWxpbmVzMy9jb21tb24vdXRpbHMucHmUjARmdW5jlEuEQwIEAZSMA3ZhbJSFlCl0lFKUfZQojAtfX3BhY2thZ2VfX5SMGHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbpSMCF9fbmFtZV9flIwec3RhYmxlX2Jhc2VsaW5lczMuY29tbW9uLnV0aWxzlIwIX19maWxlX1+UjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaB99lH2UKGgWaA2MDF9fcXVhbG5hbWVfX5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgXjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOURz/JmZmZmZmahZRSlIWUjBdfY2xvdWRwaWNrbGVfc3VibW9kdWxlc5RdlIwLX19nbG9iYWxzX1+UfZR1hpSGUjAu"}, "clip_range_vf": null, "normalize_advantage": true, "target_kl": null, "lr_schedule": {":type:": "<class 'function'>", ":serialized:": "gAWVxQIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwSIAFMAlE6FlCmMAV+UhZSMSS91c3IvbG9jYWwvbGliL3B5dGhvbjMuMTAvZGlzdC1wYWNrYWdlcy9zdGFibGVfYmFzZWxpbmVzMy9jb21tb24vdXRpbHMucHmUjARmdW5jlEuEQwIEAZSMA3ZhbJSFlCl0lFKUfZQojAtfX3BhY2thZ2VfX5SMGHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbpSMCF9fbmFtZV9flIwec3RhYmxlX2Jhc2VsaW5lczMuY29tbW9uLnV0aWxzlIwIX19maWxlX1+UjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaB99lH2UKGgWaA2MDF9fcXVhbG5hbWVfX5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgXjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOURz8zqSowVTJhhZRSlIWUjBdfY2xvdWRwaWNrbGVfc3VibW9kdWxlc5RdlIwLX19nbG9iYWxzX1+UfZR1hpSGUjAu"}, "system_info": {"OS": "Linux-6.1.85+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sun Apr 28 14:29:16 UTC 2024", "Python": "3.10.12", "Stable-Baselines3": "2.0.0a5", "PyTorch": "2.3.0+cu121", "GPU Enabled": "True", "Numpy": "1.25.2", "Cloudpickle": "2.2.1", "Gymnasium": "0.28.1", "OpenAI Gym": "0.25.2"}}
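Apart from runtime details (function addresses, `start_time`, rollout buffers, library versions), the old and new `config.json` describe the same PPO setup, so the training hyperparameters can be read straight from the JSON above. A hedged sketch of the equivalent Stable-Baselines3 training call; the `MlpPolicy` alias and the `make_vec_env` helper are assumptions, everything else is taken from the config:

```python
from stable_baselines3 import PPO
from stable_baselines3.common.env_util import make_vec_env

# 16 parallel environments, as recorded under "n_envs" in config.json.
env = make_vec_env("LunarLander-v2", n_envs=16)

model = PPO(
    "MlpPolicy",            # resolves to ActorCriticPolicy (assumed alias)
    env,
    learning_rate=3e-4,     # "learning_rate": 0.0003
    n_steps=1024,           # "n_steps": 1024
    batch_size=64,          # "batch_size": 64
    n_epochs=4,             # "n_epochs": 4
    gamma=0.999,            # "gamma": 0.999
    gae_lambda=0.98,        # "gae_lambda": 0.98
    ent_coef=0.01,          # "ent_coef": 0.01
    vf_coef=0.5,            # "vf_coef": 0.5
    max_grad_norm=0.5,      # "max_grad_norm": 0.5
    verbose=1,              # "verbose": 1
)
model.learn(total_timesteps=1_000_000)  # "_total_timesteps": 1000000
model.save("ppo-LunarLander-v2")
```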
ppo-LunarLander-v2.zip CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:24aa0f5e19f4eedaeb3ffb955b86c4f255936d16cede810b3582dc85d3459792
- size 148084
+ oid sha256:e7579c02cef943eaf7e94dd3f8f1226cfdf7c4e78123076ddf0ff235e49bc4d3
+ size 148068
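The set of files rewritten by this commit (the zipped checkpoint above, plus `config.json`, `results.json`, `replay.mp4`, and the README) matches what the `huggingface_sb3` packaging helper regenerates on each upload, so the commit was most likely produced by a call along these lines. This is a sketch, not the author's script; only the repo id and the commit message are taken from this commit:

```python
import gymnasium as gym
from huggingface_sb3 import package_to_hub
from stable_baselines3 import PPO
from stable_baselines3.common.monitor import Monitor
from stable_baselines3.common.vec_env import DummyVecEnv

model = PPO.load("ppo-LunarLander-v2")  # locally trained checkpoint (assumed path)

# Evaluation env with rgb_array rendering so the helper can record replay.mp4.
eval_env = DummyVecEnv(
    [lambda: Monitor(gym.make("LunarLander-v2", render_mode="rgb_array"))]
)

package_to_hub(
    model=model,
    model_name="ppo-LunarLander-v2",
    model_architecture="PPO",
    env_id="LunarLander-v2",
    eval_env=eval_env,
    repo_id="ahGadji/ppo-LunarLander-v2",
    commit_message="My first DRFL",
)
```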
ppo-LunarLander-v2/data CHANGED
@@ -4,20 +4,20 @@
  ":serialized:": "gAWVOwAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMEUFjdG9yQ3JpdGljUG9saWN5lJOULg==",
  "__module__": "stable_baselines3.common.policies",
  "__doc__": "\n Policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Features extractor to use.\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ",
- "__init__": "<function ActorCriticPolicy.__init__ at 0x7e6929b0d240>",
- "_get_constructor_parameters": "<function ActorCriticPolicy._get_constructor_parameters at 0x7e6929b0d2d0>",
- "reset_noise": "<function ActorCriticPolicy.reset_noise at 0x7e6929b0d360>",
- "_build_mlp_extractor": "<function ActorCriticPolicy._build_mlp_extractor at 0x7e6929b0d3f0>",
- "_build": "<function ActorCriticPolicy._build at 0x7e6929b0d480>",
- "forward": "<function ActorCriticPolicy.forward at 0x7e6929b0d510>",
- "extract_features": "<function ActorCriticPolicy.extract_features at 0x7e6929b0d5a0>",
- "_get_action_dist_from_latent": "<function ActorCriticPolicy._get_action_dist_from_latent at 0x7e6929b0d630>",
- "_predict": "<function ActorCriticPolicy._predict at 0x7e6929b0d6c0>",
- "evaluate_actions": "<function ActorCriticPolicy.evaluate_actions at 0x7e6929b0d750>",
- "get_distribution": "<function ActorCriticPolicy.get_distribution at 0x7e6929b0d7e0>",
- "predict_values": "<function ActorCriticPolicy.predict_values at 0x7e6929b0d870>",
  "__abstractmethods__": "frozenset()",
- "_abc_impl": "<_abc._abc_data object at 0x7e6929aafc80>"
  },
  "verbose": 1,
  "policy_kwargs": {},
@@ -26,12 +26,12 @@
  "_num_timesteps_at_start": 0,
  "seed": null,
  "action_noise": null,
- "start_time": 1712863917986603256,
  "learning_rate": 0.0003,
  "tensorboard_log": null,
  "_last_obs": {
  ":type:": "<class 'numpy.ndarray'>",
- ":serialized:": "gAWVdQIAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYAAgAAAAAAAOaPCz5FMaA+zsvjvXMGdb4SnhA9OhpgvQAAAAAAAAAATfYbPR9l7LklwvA6Is4WNi22gDtcLg+6AACAPwAAgD/Tli8+9J6SP6O7Fj9SZ6i+eELcPC3PHz4AAAAAAAAAAOZfGT0UDJi6GketOrPJHDZ1I9w6nhTFuQAAgD8AAIA/YEMfPuuYOz8hQke9fv6QvuBPuTx5n8c8AAAAAAAAAAAAFlK99kQZul6/Xbu8pya2mxHDOr1GfzoAAIA/AACAPwCziL1c/2y66HDZuWXQXbboyyk7jUnQNQAAgD8AAIA/mt66PfdfIj9VTeO9WVRjvvCFhLvahW49AAAAAAAAAACa9AA94YizuhiGRDnxzyw0vRLUOP56YLgAAIA/AACAP82SLDyOHpg9+2uTvbd2bL4k4qC9KNhsPQAAAAAAAAAAMxaVvPacFbq+5+Y6b7XXNRIdVrvpowa6AACAPwAAgD/zzUG+Kb28PxpV5L4/iBi+5mWDvkpL870AAAAAAAAAAADGZrwbo+s9HUo/vczlVb4j4K+8w9h6PQAAAAAAAAAAmpi2PEgnqLq1+tQ6GpbCNYcsUjoLpvO5AACAPwAAgD9NOhG9SL+IuqoKLTqO/4g1ajiUOsLJRrkAAIA/AACAP810ADyPji26eDDXuw6YxLLxlAu4kphxsQAAgD8AAIA/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwiGlIwBQ5R0lFKULg=="
  },
  "_last_episode_starts": {
  ":type:": "<class 'numpy.ndarray'>",
@@ -45,7 +45,7 @@
  "_stats_window_size": 100,
  "ep_info_buffer": {
  ":type:": "<class 'collections.deque'>",
- ":serialized:": "gAWVQgwAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHQGa4S13MY/GMAWyUTegDjAF0lEdAkrKPIjnmrHV9lChoBkdAYcrsAvL5h2gHTegDaAhHQJLDj3i704B1fZQoaAZHQGOcN9ph4MZoB03oA2gIR0CS06hG6PKddX2UKGgGR0BhDGv2Xb/PaAdN6ANoCEdAktZG38XN1XV9lChoBkdAZFdKLbYbsGgHTegDaAhHQJLWcHQhOgx1fZQoaAZHQGGJeyRjjJdoB03oA2gIR0CS2VA6uGKydX2UKGgGR0BnSoIldC3PaAdN6ANoCEdAkttqoqCpWHV9lChoBkdAZLxlgc94eWgHTegDaAhHQJLh5zT4L1F1fZQoaAZHQGM9lr/KhctoB03oA2gIR0CS4s5GBnSOdX2UKGgGR0BjlPqX4TK1aAdN6ANoCEdAkuZ5Dqnm73V9lChoBkdAX3uQMhHLBGgHTegDaAhHQJLmnboKUml1fZQoaAZHQFunCo0hvBJoB03oA2gIR0CS6SbqyGBXdX2UKGgGR0BdyGMn7YTTaAdN6ANoCEdAkwBNDUmUn3V9lChoBkdAYJXFpfx+a2gHTegDaAhHQJMDC3w1BMV1fZQoaAZHQGVDOuJUHY9oB03oA2gIR0CTA4/rB0p3dX2UKGgGR0Bih0qSX+l1aAdN6ANoCEdAkwtHjp9qlHV9lChoBkdAXhTgl4TsY2gHTegDaAhHQJMMDLB9Cu51fZQoaAZHQGOHOUD+zdFoB03oA2gIR0CTGirzXjEOdX2UKGgGR0BelPUz9CNTaAdN6ANoCEdAkytjYNAkcHV9lChoBkdAYwFrTH80lGgHTegDaAhHQJMt7JPqLTB1fZQoaAZHQGM6A31jAi5oB03oA2gIR0CTLhBwMpgDdX2UKGgGR0BbjuhkAggYaAdN6ANoCEdAkzDKjFhod3V9lChoBkdAZX4VpKzzE2gHTegDaAhHQJMysMRYigV1fZQoaAZHQFniMUh3aBZoB03oA2gIR0CTOKgGr0aqdX2UKGgGR0BjFSEHt4RmaAdN6ANoCEdAkzmBRIjGDXV9lChoBkdAYhlHlOoHcGgHTegDaAhHQJM9De/Ho5h1fZQoaAZHQGCFACnxaxJoB03oA2gIR0CTPTt78ejmdX2UKGgGR0BhuNupCKJmaAdN6ANoCEdAkz+l7dBSk3V9lChoBkdAOeyv5gw482gHTT0BaAhHQJNA3nW8RL91fZQoaAZHQEKGWykbgj1oB01kAWgIR0CTVqER8MNMdX2UKGgGR0Bcfo1gpjMFaAdN6ANoCEdAk1b2cBltj3V9lChoBkdAY2PSYw7DEWgHTegDaAhHQJNZmwnpjc51fZQoaAZHQGRyHCwbEP1oB03oA2gIR0CTWjAM2FWXdX2UKGgGR0BEBWK/EfknaAdL9WgIR0CTYTNx2jfvdX2UKGgGR0Bh7f9pAUtaaAdN6ANoCEdAk2I/C/GlynV9lChoBkdAZZAwhW5pamgHTegDaAhHQJNi93dKujh1fZQoaAZHQF1Lj8UEgW9oB03oA2gIR0CTcBxtpEhJdX2UKGgGR0Bi4pYHPeHjaAdN6ANoCEdAk35Ke9SMtXV9lChoBkdAYqLCl7+kxmgHTegDaAhHQJOF2+evpyJ1fZQoaAZHQF4FKE384xVoB03oA2gIR0CTiCTUy57PdX2UKGgGR0BkM81fmcOLaAdN6ANoCEdAk45vZdv863V9lChoBkdAYYBKPGQ0XWgHTegDaAhHQJOPVwVCXyB1fZQoaAZHQF89ffoA4n5oB03oA2gIR0CTkt4VRDTjdX2UKGgGR0BfXvjXFtKqaAdN6ANoCEdAk5MBpg1FY3V9lChoBkdAZuIyJsO5KGgHTegDaAhHQJOWS1og3cZ1fZQoaAZHQGWpOZCv5gxoB03oA2gIR0CTmFT0xubadX2UKGgGR0BamhjOLR8daAdN6ANoCEdAk5iivcJtznV9lChoBkdAYtRzS1E3KmgHTegDaAhHQJOsX0se4kN1fZQoaAZHQGRWDjrAxi5oB03oA2gIR0CTrNhrFfiQdX2UKGgGR0Beh6BI4EOiaAdN6ANoCEdAk7UKInBtUHV9lChoBkdAYMBFPznRs2gHTegDaAhHQJO2RTZQHiZ1fZQoaAZHQGLAOBUaQ3hoB03oA2gIR0CTtwt6HCXQdX2UKGgGR0AgA/tY0VJuaAdNLwFoCEdAk7wMXenAI3V9lChoBkdAYqI+36Q/5mgHTegDaAhHQJPDJtpEhJR1fZQoaAZHQFioq8lHBk9oB03oA2gIR0CT0ZuuzQeFdX2UKGgGR0BngEstkFwDaAdN6ANoCEdAk9cyIcinpHV9lChoBkdAY9s4SYgJTmgHTegDaAhHQJPZdeAuqWF1fZQoaAZHQGAu2wFC9h9oB03oA2gIR0CT4TdnTRYzdX2UKGgGR0BklCih37k5aAdN6ANoCEdAk+KHyd4FA3V9lChoBkdAY6Kgow22omgHTegDaAhHQJPnyd5IH1R1fZQoaAZHQGgOM9KVY6poB03oA2gIR0CT5+3mV7hOdX2UKGgGR0Be4Jgb6xgRaAdN6ANoCEdAk+uIs7MgU3V9lChoBkdAYeocCHRCyGgHTegDaAhHQJPuBRyfcvd1fZQoaAZHQGAgnX/YJ3RoB03oA2gIR0CUAjmUnogWdX2UKGgGR0BiYeZJCjUNaAdN6ANoCEdAlAK+3H7xeHV9lChoBkdAY5l/oaDPGGgHTegDaAhHQJQJIMEzO5d1fZQoaAZHQGbRrbYbsGBoB03oA2gIR0CUCgg7o0Q9dX2UKGgGR0Bktj48EFGHaAdN6ANoCEdAlAquq//Nq3V9lChoBkdAW5Hwvxpco2gHTegDaAhHQJQP6G47Rv51fZQoaAZHQGKG08vEjxFoB03oA2gIR0CUGNqtozvadX2UKGgGR0BhW8tGus90aAdN6ANoCEdAlCZ6DGtITXV9lChoBkdAXRSiEg4ffWgHTegDaAhHQJQsTMKTjed1fZQoaAZHQD5w4iosI3RoB00mAWgIR0CULfp5eJHidX2UKGgGR0Bl8PTmW+oMaAdN6ANoCEdAlC5ysS00FnV9lChoBkdAXqezhP0qY2gHTegDaAhHQJQ0jv0AcT91fZQoaAZHQGa5al+EytVoB03oA2gIR0CUNWPk7wKCdX2UKGgGR0BllVqveP7vaAdN6ANoCEdAlDipgXuVo3V9lChoBkdAZXG9GI9C/2gHTegDaAhHQJQ4yI68xsV1fZQoaAZHQGXUTHS4OMFoB03oA2gIR0CUPBkFfReDdX2UKGgGR0Bbar9l2/zraAdN6ANoCEdAlD55gkTpPnV9lChoBkdAYjd8xbjcVWgHTegDaAhHQJRBee+VTrF1fZQoaAZHQGBrM5GSZBtoB03oA2gIR0CUQiLJjlPrdX2UKGgGR0BhU9DBuXNUaAdN6ANoCEdAlFsgmNR3vHV9lChoBkdAYUJl+3H7xmgHTegDaAhHQJRcDFxXGOx1
fZQoaAZHQGPDK2a2F39oB03oA2gIR0CUXLb212JSdX2UKGgGR0BK5khA4XGfaAdNFgFoCEdAlGD+sYEW7HV9lChoBkdAZPSzN2TxG2gHTegDaAhHQJRiItthuwZ1fZQoaAZHQCImXmeUY9BoB00tAWgIR0CUbpKJ2t+1dX2UKGgGR0BI/1Fpfx+baAdL7GgIR0CUb/qQzUI+dX2UKGgGR0BhDp1q33HraAdN6ANoCEdAlHgCaNMoMXV9lChoBkdAYFwbhFVktmgHTegDaAhHQJR9CKMvRJF1fZQoaAZHQGH/HqeK8+RoB03oA2gIR0CUfpJ4B3iadX2UKGgGR0BdUZqh11W9aAdN6ANoCEdAlH8CDRMN+nV9lChoBkdAYNL9tMwlB2gHTegDaAhHQJSEvh99c8l1fZQoaAZHQGOGXta6jFhoB03oA2gIR0CUhYdwvQF+dX2UKGgGR0Bg1Vq33HrAaAdN6ANoCEdAlIkI33pOe3V9lChoBkdAYqhOlfqoqGgHTegDaAhHQJSJKa2F36h1fZQoaAZHQGDvyoXKr7xoB03oA2gIR0CUjMcNYr8SdX2UKGgGR0BihLujRD1HaAdN6ANoCEdAlJKZv5xionV9lChoBkdAZwFVXFLnLmgHTegDaAhHQJSTMxREWqN1ZS4="
  },
  "ep_success_buffer": {
  ":type:": "<class 'collections.deque'>",
 
  ":serialized:": "gAWVOwAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMEUFjdG9yQ3JpdGljUG9saWN5lJOULg==",
  "__module__": "stable_baselines3.common.policies",
  "__doc__": "\n Policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Features extractor to use.\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ",
+ "__init__": "<function ActorCriticPolicy.__init__ at 0x7e4adade85e0>",
+ "_get_constructor_parameters": "<function ActorCriticPolicy._get_constructor_parameters at 0x7e4adade8670>",
+ "reset_noise": "<function ActorCriticPolicy.reset_noise at 0x7e4adade8700>",
+ "_build_mlp_extractor": "<function ActorCriticPolicy._build_mlp_extractor at 0x7e4adade8790>",
+ "_build": "<function ActorCriticPolicy._build at 0x7e4adade8820>",
+ "forward": "<function ActorCriticPolicy.forward at 0x7e4adade88b0>",
+ "extract_features": "<function ActorCriticPolicy.extract_features at 0x7e4adade8940>",
+ "_get_action_dist_from_latent": "<function ActorCriticPolicy._get_action_dist_from_latent at 0x7e4adade89d0>",
+ "_predict": "<function ActorCriticPolicy._predict at 0x7e4adade8a60>",
+ "evaluate_actions": "<function ActorCriticPolicy.evaluate_actions at 0x7e4adade8af0>",
+ "get_distribution": "<function ActorCriticPolicy.get_distribution at 0x7e4adade8b80>",
+ "predict_values": "<function ActorCriticPolicy.predict_values at 0x7e4adade8c10>",
  "__abstractmethods__": "frozenset()",
+ "_abc_impl": "<_abc._abc_data object at 0x7e4adadfc7c0>"
  },
  "verbose": 1,
  "policy_kwargs": {},
 
  "_num_timesteps_at_start": 0,
  "seed": null,
  "action_noise": null,
+ "start_time": 1717796620904468447,
  "learning_rate": 0.0003,
  "tensorboard_log": null,
  "_last_obs": {
  ":type:": "<class 'numpy.ndarray'>",
+ ":serialized:": "gAWVdQIAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYAAgAAAAAAAJOQWj43xD8/FXJaPQAfuL67Sgs+kywpvQAAAAAAAAAAZlLGPBSMsbq+xc68fPKjPCcXqLqAro09AACAPwAAgD9meQ09MXGtP3tpfD2A2qy+k6OTunfupTwAAAAAAAAAAK3EHD40WmY/+6uePQcY2L6Pg6k9AB1xvQAAAAAAAAAAzVRWPhVw8T5uF2O+bw2OvjwPGbyC74a8AAAAAAAAAACzb7c9QylKvMqFh72a5y88v3ivPff5E70AAAAAAAAAAJrxtTs/a2w/ujvHPRkYlb7gXN28GVQKPgAAAAAAAAAAAFTOvVsTsz+pcbG+FPO6vu3t670DtDG+AAAAAAAAAAAzJbu8rpWfulYMbrZE5W6xHN6qOc5ijjUAAIA/AACAPy1UDL7qyh8/4ffFPcPnb77bpRC9IpyYPQAAAAAAAAAAVkpevpxyHz9xFQQ+U+p8vgRHbr0miQY9AAAAAAAAAAAAVnS8hdT6u11B5rsBgaw8sYBHve5Zjz0AAIA/AACAPzqPNz5ftmA/IuEYPVX+tb4YMes9KHaruQAAAAAAAAAAYjOUvszxYz+DSFA8DaaFvvlhQr53FhO9AAAAAAAAAADm7lC9M32vPp/BFz5nbWy+h/IJOy/egzwAAAAAAAAAAGa6Or2IM4+8m5nDvFLrZrzZK3i8dtOVuwAAgD8AAIA/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwiGlIwBQ5R0lFKULg=="
  },
  "_last_episode_starts": {
  ":type:": "<class 'numpy.ndarray'>",
 
  "_stats_window_size": 100,
  "ep_info_buffer": {
  ":type:": "<class 'collections.deque'>",
+ ":serialized:": "gAWVNgwAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHQHAnGxQizLSMAWyUTRoBjAF0lEdAlL/LZ8KG+XV9lChoBkdAcRRxVhkRSWgHTRoBaAhHQJTABW2gFot1fZQoaAZHQHLbpWV/tppoB01MAWgIR0CUwCkEcKgJdX2UKGgGR0BNqARbr1M/aAdL3mgIR0CUwHXC0ngHdX2UKGgGR0BvsLpiZv1laAdNQAFoCEdAlMHi5/b0v3V9lChoBkdAULXTkQwsXmgHS+9oCEdAlMHrfUF0P3V9lChoBkdAcLdHEMspX2gHTUcBaAhHQJTCPalDWsl1fZQoaAZHQHAns/QjUutoB00WAWgIR0CUwrbfP5YYdX2UKGgGR0BwrwF3Y+SsaAdNDAFoCEdAlMNlQ66renV9lChoBkdAcWOSKFZgX2gHTRwBaAhHQJTDzVRUFSt1fZQoaAZHQHHraBun/DNoB02hAWgIR0CUw93BpHqedX2UKGgGR0BCYZQgs9SuaAdL4mgIR0CUxWVlwtJ4dX2UKGgGR0BDMYPGyX2NaAdL4WgIR0CUxoiB5HEudX2UKGgGR0BFNaS1Vo6CaAdL7GgIR0CUxriw0O3EdX2UKGgGR0BwAUXWOIZZaAdNQgFoCEdAlMbBddE9dXV9lChoBkdAcF9t4A0bcWgHTQwBaAhHQJTHTfvWpZR1fZQoaAZHQHGXMS5AhStoB00sAWgIR0CUx1FYuCf6dX2UKGgGR0BxfC2lVLi/aAdNOwFoCEdAlMeoh6jWTXV9lChoBkdAb/DEG7jDK2gHTSABaAhHQJTImvIOpbV1fZQoaAZHQG7WWRigCfZoB01WAWgIR0CUyYGpuMuOdX2UKGgGR0Bw/KANG3F2aAdNIAFoCEdAlMoerhisn3V9lChoBkdAcU133Hq/umgHTQABaAhHQJTK1qCYkVx1fZQoaAZHQG3RaVMVUMpoB003AWgIR0CUyytelbeNdX2UKGgGR0BwmKZmZmZmaAdNOQFoCEdAlMu8495hSnV9lChoBkdAbTvlRP420mgHTTIBaAhHQJTM4b83uNR1fZQoaAZHQGz3XPqs2ehoB02TAWgIR0CUzZpwS8J2dX2UKGgGR0ByO2MKkVN6aAdNAgFoCEdAlM5pqZc9n3V9lChoBkdAcM3uQ6p5vGgHTS0BaAhHQJTOb/DLr5Z1fZQoaAZHQHFrOzlcQiBoB00eAWgIR0CUz0NVR1oydX2UKGgGR0BE3OMuOCGvaAdL5WgIR0CUz5lHjIaMdX2UKGgGR0BwDyPGQ0XQaAdNNwFoCEdAlM/fTG5tnHV9lChoBkdAclJVlf7aZmgHTTIBaAhHQJTQ+m2sq8V1fZQoaAZHQHDaRkd3jdZoB01MAWgIR0CU0V75mAbydX2UKGgGR0ByP3cxj8UFaAdNWgFoCEdAlNHB+fAbhnV9lChoBkdAcWJR9w3o92gHTSYBaAhHQJTSf8vVVgh1fZQoaAZHQHMEGW6bvw5oB00cAWgIR0CU0s9K28ZldX2UKGgGR0ArY7oSteUqaAdL9GgIR0CU0zvAoG6gdX2UKGgGR0ByP84YJmdzaAdNLwFoCEdAlNQLmuDBdnV9lChoBkdAUCzMotthu2gHS/loCEdAlNU8JIDoyXV9lChoBkdAcYse/5+H8GgHS/xoCEdAlNY8l9jPOnV9lChoBkdAbXdrdFfAsWgHTTUBaAhHQJTWjD1oQFt1fZQoaAZHQG6lLSmZVn5oB00fAWgIR0CU13jdHlOodX2UKGgGR0Bw8CSq2jO+aAdNDAFoCEdAlNgcAR02cnV9lChoBkdAcSg+MqBmPGgHTa0BaAhHQJTYy57PY4B1fZQoaAZHQEt0Z3s5XEJoB0vgaAhHQJTZBVYISlF1fZQoaAZHQHHyjibUgB9oB01LAWgIR0CU7tz3h4t6dX2UKGgGR0Bxrpy0a6z3aAdNMAFoCEdAlO++Sr5qM3V9lChoBkdAcIq0Yj0L+mgHTSoBaAhHQJTv9kxyn1p1fZQoaAZHQHBxClSCOFRoB00qAWgIR0CU8UMb3oLYdX2UKGgGR0Bv+oqiGnGbaAdNjAFoCEdAlPGbT6SDAnV9lChoBkdANHQnc+JP7GgHS61oCEdAlPIxA0Kqn3V9lChoBkdAUi19Tgl4T2gHS81oCEdAlPJYe1a4c3V9lChoBkdAb0z91EE1VGgHTUgBaAhHQJTyjHxSYPZ1fZQoaAZHQG3/cDB/I81oB00wAWgIR0CU8xLSeAd5dX2UKGgGR0BxDDNr0rbyaAdNXQFoCEdAlPOHfQ8fWHV9lChoBkdAcrEtALRa5mgHTSgBaAhHQJTzzjin5zp1fZQoaAZHQG6IcO09hZ1oB003AWgIR0CU9PUXHim3dX2UKGgGR0BwyWBreqJeaAdNCAFoCEdAlPWnDJlrdnV9lChoBkdAb17FiKBNEmgHTTkBaAhHQJT2idYnv2J1fZQoaAZHQGAsC8e0XxhoB03oA2gIR0CU90MaCL/CdX2UKGgGR0BxcP8+A3DOaAdNRwFoCEdAlPe3rhR64XV9lChoBkdAcG3AG0NSZWgHTSwBaAhHQJT3tV4oqkN1fZQoaAZHQG/O5EUj9n9oB00tAWgIR0CU+L71ZkkKdX2UKGgGR0ByTzobGWD6aAdNTQFoCEdAlPmHUYsND3V9lChoBkdAcf9QpWmxdWgHTRkBaAhHQJT5r6Fdszl1fZQoaAZHQHC4FMmF8G9oB00qAWgIR0CU+d/o7muDdX2UKGgGR0Br9Gjj7yhBaAdNMAFoCEdAlPtXcL0BfnV9lChoBkdAckqFdcB2fWgHTSsBaAhHQJT7xb1RLsd1fZQoaAZHQHJmCPEKmbdoB01PAWgIR0CU/BVjI7vHdX2UKGgGR0BtG7T6SDAaaAdNMAFoCEdAlPxv16E8JXV9lChoBkdAcWidpItlI2gHTXUBaAhHQJT9Cqgh8pl1fZQoaAZHQHEeA3kxREZoB009AWgIR0CU/SADq4YrdX2UKGgGR0BxsmowVTJhaAdNTAFoCEdAlP7oWUKRdXV9lChoBkdAcio850bLlmgHTTIBaAhHQJT+5hfBvaV1fZQoaAZHQG3rMkpqh11oB00nAWgIR0CVAEeXAuZkdX2UKGgGR0BxZbHwPRReaAdNQQFoCEdAlQBTO9nK4nV9lChoBkdAbaJf2K2rn2gHTS4BaAhHQJUA8JMQEp11fZQoaAZHQHKMvY4ACGNoB00SAWgIR0CVASLNwBHTdX2UKGgGR0Bu0fMbFS88aAdNTQFoCEdAlQHaIacZtXV9lChoBkdAcgMXq7iAD2gHTQ8BaAhHQJUCH7iyY5V1fZQoaAZHQG1NZEUj9n9oB00cAWgIR0CVAlB5X2dvdX2UKGgGR0BvGEz0pVjqaAdNJQFoCEdAlQJyaVlf7nV9lChoBkdARO+ktVaOgmgHS+1oCEdAlQMdVR1ox3V9lChoBkdAb3Kv7FbV0GgHTSMBaAhHQJUD7gTAWSF1fZQoaAZHQHCgocJd
B0JoB006AWgIR0CVBPrwOOKgdX2UKGgGR0By1B7SiM5waAdNKAFoCEdAlQUY593KS3V9lChoBkdAb5U0pEx7A2gHTScBaAhHQJUFqGHpKSR1fZQoaAZHQG9XXiaRZEFoB00wAWgIR0CVBfrTH80ldX2UKGgGR0BtuReb/ffoaAdNFwFoCEdAlQbwUg0TDnV9lChoBkdAcAD/m1YyPGgHTSsBaAhHQJUHg/FBIFx1fZQoaAZHQHB6mT9sJppoB001AWgIR0CVCWGxlg+hdX2UKGgGR0Bt/r/wRXfZaAdNJgFoCEdAlQmRMvh60XV9lChoBkdAcXGWGh24eGgHTUEBaAhHQJUJwjhUBGR1fZQoaAZHQHCZznaFmFtoB00XAWgIR0CVCnJPZZjhdX2UKGgGR0A6GxcVxjriaAdL4GgIR0CVCuAbQ1JldX2UKGgGR0BuVypNsWO7aAdNIQFoCEdAlQsn003wTnV9lChoBkdAcEbegte2NWgHTTwBaAhHQJULZLZi/fx1fZQoaAZHQG+vBjOLR8doB01bAWgIR0CVC5aRp1zRdX2UKGgGR0BwTzy+Yc//aAdNSAFoCEdAlQwxqj8DS3V9lChoBkdAbsfJ04iosWgHTTcBaAhHQJUMmB9Tgl51ZS4="
  },
  "ep_success_buffer": {
  ":type:": "<class 'collections.deque'>",
ppo-LunarLander-v2/policy.optimizer.pth CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7a3ada51f66df47fae5d33e220feb78112321193f6ddaa09bbdadece675f57ef
+ oid sha256:0890086ace9364acd78acf7e66ee992a37e3794829b0e817f3e0b10143a41a9c
  size 88362
ppo-LunarLander-v2/policy.pth CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e953c80f38193587ae69d23251026e3a72925677093a909aee172b74b3f62003
+ oid sha256:fce5e1a7ddf1fc8c3ee9be166f67957176a4ffcd2bfa132e6205115b475e7d42
  size 43762
ppo-LunarLander-v2/system_info.txt CHANGED
@@ -1,7 +1,7 @@
- - OS: Linux-6.1.58+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sat Nov 18 15:31:17 UTC 2023
+ - OS: Linux-6.1.85+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sun Apr 28 14:29:16 UTC 2024
  - Python: 3.10.12
  - Stable-Baselines3: 2.0.0a5
- - PyTorch: 2.2.1+cu121
+ - PyTorch: 2.3.0+cu121
  - GPU Enabled: True
  - Numpy: 1.25.2
  - Cloudpickle: 2.2.1
replay.mp4 CHANGED
Binary files a/replay.mp4 and b/replay.mp4 differ
 
results.json CHANGED
@@ -1 +1 @@
- {"env_id": "LunarLander-v2", "mean_reward": 202.44502710351713, "std_reward": 64.47205682551407, "n_evaluation_episodes": 10, "eval_datetime": "2024-05-23T12:05:58.652636"}
+ {"mean_reward": 255.49086619999997, "std_reward": 16.99317949716753, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-06-07T22:06:22.237842"}
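The updated `results.json` records a deterministic 10-episode evaluation (mean reward 255.49 ± 16.99). The Hub helper writes this file automatically, but the same entry can be reproduced with SB3's `evaluate_policy`; the checkpoint path in this sketch is an assumption:

```python
import json
from datetime import datetime

import gymnasium as gym
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy
from stable_baselines3.common.monitor import Monitor

model = PPO.load("ppo-LunarLander-v2")          # assumed local checkpoint path
eval_env = Monitor(gym.make("LunarLander-v2"))  # Monitor gives accurate episode returns

# Deterministic, 10-episode evaluation: the same protocol as the fields above.
mean_reward, std_reward = evaluate_policy(
    model, eval_env, n_eval_episodes=10, deterministic=True
)

with open("results.json", "w") as f:
    json.dump(
        {
            "mean_reward": float(mean_reward),
            "std_reward": float(std_reward),
            "is_deterministic": True,
            "n_eval_episodes": 10,
            "eval_datetime": datetime.now().isoformat(),
        },
        f,
    )
```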