ppo-LunarLander-v2 / results.json
My first commit for the Lunar Lander v2 using PPO
56a7851
{"mean_reward": 252.86181434938817, "std_reward": 23.676541344415376, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-01-11T03:31:59.759814"}
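A file like this is typically generated after evaluating a trained agent over a fixed number of episodes: the per-episode returns are collected, their mean and standard deviation are computed, and the summary is serialized to JSON. The sketch below shows one way to produce such a file using only the standard library; the `episode_rewards` values are hypothetical placeholders, not the actual returns behind the numbers above.

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical per-episode returns; in practice these would come from
# rolling out the trained PPO policy in LunarLander-v2 for 10 episodes.
episode_rewards = [251.3, 248.9, 280.1, 222.4, 265.0,
                   240.7, 271.8, 233.2, 259.6, 247.5]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # population std dev (divides by N), matching numpy's default np.std
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,          # greedy action selection during eval
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now(timezone.utc).isoformat(),
}

with open("results.json", "w") as f:
    json.dump(results, f)
```

The mean reward of ~252 recorded above clears the 200-point threshold commonly used to consider LunarLander-v2 solved, and the standard deviation quantifies how much that score varies across the 10 evaluation episodes.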