ppo-lunar-lander / results.json
{"mean_reward": 274.8274609397559, "std_reward": 24.239644389195195, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-10T19:27:14.732985"}