Load model from vision_encoder.onnx failed: Protobuf parsing failed.
#1 opened by promistrio
Hello everyone!
I'm trying to run the model, but I encounter an error.
File "onnx/onnxrun.py", line 10, in <module>
vision_encoder = ort.InferenceSession("vision_encoder.onnx", providers=['CPUExecutionProvider'])
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 460, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from vision_encoder.onnx failed:Protobuf parsing failed.
I would be glad if you could help me get ONNX running first.
After that, when I ran convert.py it required onnxscript, which now depends on onnx==1.17, while rknn-toolkit depends on onnx==1.14. That will be the next challenge.
rknn-toolkit2 (v2.3.0)
- Check that the ONNX files are actual models and not Git LFS pointers (a quick check is sketched below).
- Just install onnx==1.17 and ignore the dependency issue.
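For reference, a minimal sketch of how to tell an LFS pointer apart from the real model (assuming the file sits in your working directory):

```python
# Quick check: a Git LFS pointer is a tiny text file that starts with
# "version https://git-lfs.github.com/spec/v1" instead of binary protobuf data.
import os

path = "vision_encoder.onnx"  # adjust to where the file actually lives
print(f"size on disk: {os.path.getsize(path)} bytes")  # a pointer is only ~130 bytes

with open(path, "rb") as f:
    head = f.read(64)

if head.startswith(b"version https://git-lfs"):
    print("Git LFS pointer - run `git lfs pull` to fetch the real weights.")
else:
    print("Looks like a real binary file.")
```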
Thanks, you are right. They were Git LFS pointers. I fixed that and got another error:
python onnx/onnxrun.py
Traceback (most recent call last):
File "onnx/onnxrun.py", line 10, in <module>
vision_encoder = ort.InferenceSession("vision_encoder.onnx", providers=['CPUExecutionProvider'])
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 452, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from vision_encoder.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model.cc:149 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&, const onnxruntime::ModelOptions&) Unsupported model IR version: 10, max supported IR version: 9
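This error means the ONNX files were exported with a newer ONNX release (IR version 10) than your onnxruntime build can read (max IR version 9). The clean fix is a newer onnxruntime, if a wheel exists for Python 3.8 on your board; otherwise a workaround that often (but not always) works is to rewrite the IR version field in the file. A minimal sketch, assuming the onnx package is installed and the model stays under the 2 GB protobuf limit:

```python
# Workaround sketch: downgrade the declared IR version so an older onnxruntime
# will accept the file. This only touches the header field; loading can still
# fail later if the graph uses opsets your onnxruntime build does not implement.
import onnx

model = onnx.load("vision_encoder.onnx")
print("IR version in file:", model.ir_version)  # expected to print 10 here

model.ir_version = 9
# For models over 2 GB you would need onnx.save(..., save_as_external_data=True).
onnx.save(model, "vision_encoder_ir9.onnx")
```

Then point the InferenceSession at vision_encoder_ir9.onnx. If this trick fails, re-exporting the model with an older onnx/onnxscript pair is the more robust route.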
For the RKNN script the error stays the same:
python onnx/rknnrun.py
Traceback (most recent call last):
File "onnx/rknnrun.py", line 3, in <module>
from transformers import AutoProcessor
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/transformers/__init__.py", line 26, in <module>
from . import dependency_versions_check
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
from .utils.versions import require_version, require_version_core
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/transformers/utils/__init__.py", line 27, in <module>
from .chat_template_utils import DocstringParsingException, TypeHintParsingException, get_json_schema
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/transformers/utils/chat_template_utils.py", line 39, in <module>
from torch import Tensor
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/__init__.py", line 1823, in <module>
from torch import export as export
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/export/__init__.py", line 29, in <module>
from torch.fx.passes.infra.pass_base import PassResult
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/__init__.py", line 3, in <module>
from . import net_min_base
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/net_min_base.py", line 11, in <module>
from .split_utils import split_by_tags
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/split_utils.py", line 8, in <module>
from torch.fx.passes.utils import HolderModule, lift_subgraph_as_module
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/utils/__init__.py", line 1, in <module>
from .common import lift_subgraph_as_module, HolderModule, compare_graphs
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/utils/common.py", line 7, in <module>
from torch.fx.passes.utils.matcher_utils import SubgraphMatcher
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/utils/matcher_utils.py", line 31, in <module>
logger = _init_logger()
File "/home/orangepi/DEV/Florence2/florence_env/lib/python3.8/site-packages/torch/fx/passes/utils/matcher_utils.py", line 21, in _init_logger
logger.setLevel(level)
File "/usr/lib/python3.8/logging/__init__.py", line 1421, in setLevel
self.level = _checkLevel(level)
File "/usr/lib/python3.8/logging/__init__.py", line 198, in _checkLevel
raise ValueError("Unknown level: %r" % level)
ValueError: Unknown level: 'WARNING'
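This last traceback is stranger: it is the standard library's logging module rejecting the string 'WARNING', which is normally a valid level name, so either the level string torch passes in is not what it looks like, or something imported earlier in rknnrun.py has modified logging's level table. A small diagnostic sketch (the environment-variable name is my assumption about how your torch build picks that level; run it in the same virtualenv):

```python
# Diagnostic sketch for "ValueError: Unknown level: 'WARNING'".
# Assumption: torch/fx/passes/utils/matcher_utils.py reads the level from the
# PYTORCH_MATCHER_LOGLEVEL environment variable, defaulting to "WARNING".
import logging
import os

# Stray quotes or whitespace in the variable (e.g. a line like
# export PYTORCH_MATCHER_LOGLEVEL="'WARNING'") would make logging reject it;
# repr() makes such characters visible.
print(repr(os.environ.get("PYTORCH_MATCHER_LOGLEVEL", "WARNING")))

# The table logging consults when validating level names. "WARNING" should be
# in here; if it is missing, an earlier import has tampered with the logging
# module, and the import order in rknnrun.py is the place to start looking.
print(logging._nameToLevel)
```

If the printed value has extra characters, unset or fix the variable; if the table is missing entries, try reordering the imports in rknnrun.py to see which package changes it.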
Thanks for the help.