---
title: PPO Simulation
emoji: 📚
colorFrom: yellow
colorTo: green
sdk: static
pinned: false
license: mit
short_description: This simulation demonstrates Proximal Policy Optimization
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
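This Space is `static`, so no training code ships with it, but the quantity a PPO demo typically visualizes is the clipped surrogate loss from the PPO paper. Here is a minimal NumPy sketch of that objective; the function name, argument shapes, and default `eps` are illustrative assumptions, not part of this repo:

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Clipped surrogate objective (illustrative sketch, not this Space's code).

    ratio:     new_policy_prob / old_policy_prob per sample
    advantage: estimated advantage per sample
    Returns the negated mean of min(r * A, clip(r, 1-eps, 1+eps) * A),
    so gradient *descent* on it maximizes the PPO objective.
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    return -np.mean(np.minimum(unclipped, clipped))

# With a positive advantage, ratios above 1 + eps stop contributing extra gain:
loss = ppo_clip_loss(np.array([1.5]), np.array([1.0]), eps=0.2)
print(loss)  # -1.2: the ratio 1.5 is clipped to 1.2
```

The clipping is what distinguishes PPO from a plain policy-gradient step: it caps how far a single update can push the policy away from the one that collected the data.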