Export functions

Main functions

optimum.exporters.executorch.export_to_executorch


( model: Union["PreTrainedModel", "TorchExportableModuleWithStaticCache"], task: str, recipe: str, output_dir: Union[str, Path], **kwargs ) → ExecuTorchProgram

Parameters

  • model (Union["PreTrainedModel", "TorchExportableModuleWithStaticCache"]) — A PyTorch model to be exported. This can be a standard Hugging Face PreTrainedModel or a wrapped module such as TorchExportableModuleWithStaticCache for the text-generation task.
  • task (str) — The specific task the exported model will perform, e.g., “text-generation”.
  • recipe (str) — The recipe to guide the export process, e.g., “xnnpack”. Recipes define the optimization and lowering steps. Will raise an exception if the specified recipe is not registered in the recipe registry.
  • output_dir (Union[str, Path]) — Path to the directory where the resulting ExecuTorch model will be saved.
  • **kwargs — Additional configuration options passed to the recipe.

Returns

ExecuTorchProgram

The lowered ExecuTorch program object.

Export a pre-trained PyTorch model to the ExecuTorch format using a specified recipe.

This function facilitates the transformation of a PyTorch model into an optimized ExecuTorch program.
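For illustration, a minimal sketch of a call is shown below. The checkpoint name and output directory are placeholders, and passing an unwrapped PreTrainedModel is an assumption here; depending on the task configuration, the model may be wrapped (for example in TorchExportableModuleWithStaticCache) as part of the export.

```python
# Minimal sketch: export a causal LM to ExecuTorch with the "xnnpack" recipe.
# The checkpoint and output directory are illustrative placeholders.
from transformers import AutoModelForCausalLM

from optimum.exporters.executorch import export_to_executorch

model = AutoModelForCausalLM.from_pretrained("gpt2")

program = export_to_executorch(
    model=model,
    task="text-generation",
    recipe="xnnpack",          # must be registered in the recipe registry
    output_dir="./executorch_model",
)
```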

Notes:

The primary export function is designed to be model- and task-independent as well as optimization-agnostic, providing a highly flexible and modular interface for exporting Hugging Face models to the ExecuTorch backend.

This approach highlights the composability of the ExecuTorch export pipeline, where dynamically registered task configurations specify how a Hugging Face model is prepared, and recipe configurations encapsulate device-specific optimizations during export. This separation allows users to customize the export process without altering the core function.

For more details on task and recipe configurations, see Configuration for ExecuTorch Export.
