PPO-SnowballTarget / config.json

Commit History

Training Unity ML Agent with PPO
92a88f1

odiaz1066 committed on