Bad performance in game client
I tried running the model on my own computer (RTX 5080), but after injecting it into the game client it became extremely slow.
I checked another post that said it would be fine after disconnecting the 2nd monitor. I tried that, but the problem persists.
I also checked the ./out debug video. The recorded video plays back fine (maybe it's even sped up?).
Can anyone help? Thank you.
Same situation on my RTX 5090.
That's expected behavior. The AI works by artificially staggering the game speed. You can disable the staggering, but then the AI won't be able to keep up with the real-time game speed, making its inputs appear completely random.
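To make the staggering idea concrete, here is a minimal sketch (hypothetical names, not the project's actual code) of a game loop that waits for the model's action before advancing each tick; when inference takes longer than one frame, wall-clock time stretches and the game looks like slow motion:

```python
# Sketch only: `game` and `model` are placeholder objects, not this repo's API.
import time

def run_staggered(game, model, target_fps: float = 30.0) -> None:
    """Advance the game one tick per model action, however long inference takes."""
    frame_time = 1.0 / target_fps
    while game.running:
        start = time.perf_counter()
        frame = game.capture_frame()      # grab the current frame
        action = model.predict(frame)     # may take far longer than frame_time
        game.apply_action(action)
        game.step()                       # advance exactly one tick
        # Sleep only if inference finished early; otherwise the game simply
        # runs slower than real time -- the "slow motion" people are seeing.
        remaining = frame_time - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
```

If it works roughly like that, it would also explain why the ./out recording looks normal or even sped up: the video stores one frame per tick, so playing it back at the nominal frame rate hides the slow wall-clock pacing.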
NitroGen 1 was developed by NVIDIA and is the first model of the series. This model is for research and development only.
Well, perhaps we're mistaken; maybe we aren't meant to (or allowed to) play the game(s), only research and develop them. /s
To actually be helpful for the discussion, though: let's say, hypothetically, you have two GPUs. Can you use one for model inference and one to render the game, so it plays in real time without quantizing the model or making the quality worse? A rough sketch of the idea is below.
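I don't know whether this repo supports it out of the box, but if the model is a standard PyTorch module, something like the following (hypothetical `load_model` and frame handling, not this project's API) would pin inference to the second GPU and leave the first one free for rendering:

```python
# Hypothetical sketch: run inference on cuda:1 so cuda:0 stays free for the game.
import torch

device = torch.device("cuda:1" if torch.cuda.device_count() > 1 else "cuda:0")
model = load_model().to(device).eval()  # load_model() is a placeholder

@torch.no_grad()
def infer(frame_tensor: torch.Tensor) -> torch.Tensor:
    # Copy the captured frame to the inference GPU, run the model,
    # and bring the predicted action back to the CPU for input injection.
    return model(frame_tensor.to(device, non_blocking=True)).cpu()
```

Even then, real time only works if the model itself is fast enough; a second GPU removes the contention with rendering but doesn't make inference faster.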
I have a 3090 and a 3080, two graphics cards from the same manufacturer. I've tried turning off monitors, running on different graphics cards, and changing other settings, but I still get slow-motion gameplay.
You can try disabling the recording of the debug video and image files; that should improve performance slightly.
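For context on why that helps (a generic illustration, not this repo's actual code): if the debug output is written with something like OpenCV's VideoWriter, every frame encoded to ./out adds per-step CPU work, so gating it behind a flag removes that overhead:

```python
# Generic example of gating debug recording behind a flag (not this project's code).
import cv2

DEBUG_RECORD = False  # set True only when you actually need the ./out video

writer = None
if DEBUG_RECORD:
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter("./out/debug.mp4", fourcc, 30.0, (1920, 1080))

def maybe_record(frame) -> None:
    # Skips the per-frame encode entirely when recording is disabled.
    if writer is not None:
        writer.write(frame)  # expects a BGR uint8 frame matching the writer size
```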