Max context length for new InternVL 3.5 models

#1 by brandonbeiler

What is the max context length that these new models can support? I've seen a few references to 32K in the READMEs, but I was curious what the expected number is.

OpenGVLab org

Thank you for your interest in our work. The maximum context length of our model is 32K.
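For anyone who wants to confirm this on a specific checkpoint, the window is usually recoverable from the model config. Here is a minimal sketch; the repo id `OpenGVLab/InternVL3_5-8B` is only an example, and the attribute names for the nested language-model config are assumptions that may differ between checkpoint variants:

```python
from transformers import AutoConfig

# Repo id is illustrative; swap in the InternVL 3.5 variant you actually plan to run.
config = AutoConfig.from_pretrained("OpenGVLab/InternVL3_5-8B", trust_remote_code=True)

# InternVL checkpoints nest the language-model config; the attribute name can vary
# (e.g. llm_config for remote-code configs, text_config for native ones).
llm_config = getattr(config, "llm_config", None) or getattr(config, "text_config", config)

print(getattr(llm_config, "max_position_embeddings", None))  # expect 32768 for a 32K window
```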

Awesome! Good to know and thanks for the fast response. You guys crushed it with this release! Very excited to deploy the models.
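In case it helps anyone else deploying these, here is a rough serving sketch with vLLM that pins the window to the stated 32K. The repo id and flags are my own assumptions, not something confirmed in this thread:

```python
from vllm import LLM

# Illustrative only: repo id assumed; reduce max_model_len if KV-cache memory is tight.
llm = LLM(
    model="OpenGVLab/InternVL3_5-8B",
    trust_remote_code=True,
    max_model_len=32768,  # match the 32K context window stated above
)
```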
