ppo-LunarLander-v2 / results.json
{"mean_reward": 166.1447185583593, "std_reward": 45.770376682327, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-27T21:02:43.684581"}