Reference for ultralytics/utils/export/torchscript.py
Improvements
This page is sourced from https://github.com/ultralytics/ultralytics/blob/main/ultralytics/utils/export/torchscript.py. Have an improvement or example to add? Open a Pull Request — thank you! 🙏
Summary
function ultralytics.utils.export.torchscript.torch2torchscript
```python
def torch2torchscript(
    model: torch.nn.Module,
    im: torch.Tensor,
    file: Path | str,
    optimize: bool = False,
    metadata: dict | None = None,
    prefix: str = "",
) -> Path
```
Export a PyTorch model to TorchScript format.
Args
| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `torch.nn.Module` | The PyTorch model to export (may be NMS-wrapped). | required |
| `im` | `torch.Tensor` | Example input tensor for tracing. | required |
| `file` | `Path \| str` | Source model file path used to derive output path. | required |
| `optimize` | `bool` | Whether to optimize for mobile deployment. | `False` |
| `metadata` | `dict \| None` | Optional metadata to embed in the TorchScript archive. | `None` |
| `prefix` | `str` | Prefix for log messages. | `""` |
Returns
| Type | Description |
|---|---|
| `Path` | Path to the exported `.torchscript` file. |
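The export flow documented above (trace, attach metadata, save) can be sketched with plain PyTorch alone. This is a minimal illustration, not the Ultralytics implementation: the toy `Linear` model, temp-directory path, and `stride` metadata are stand-ins chosen for the example.

```python
# Minimal sketch of the trace-and-save flow using plain PyTorch only;
# the model, file path, and metadata below are illustrative stand-ins.
import json
import tempfile
from pathlib import Path

import torch

model = torch.nn.Linear(4, 2).eval()  # stand-in for the model to export
im = torch.zeros(1, 4)                # example input tensor for tracing

file = Path(tempfile.gettempdir()) / "model.pt"
f = file.with_suffix(".torchscript")  # derive the output path from the source path

ts = torch.jit.trace(model, im, strict=False)
extra_files = {"config.txt": json.dumps({"stride": 32})}  # metadata to embed
ts.save(str(f), _extra_files=extra_files)
print(f.exists())  # → True
```

With `optimize=True` the real function instead routes the traced module through `torch.utils.mobile_optimizer.optimize_for_mobile` before saving for the lite interpreter.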
Source code in ultralytics/utils/export/torchscript.py
```python
def torch2torchscript(
    model: torch.nn.Module,
    im: torch.Tensor,
    file: Path | str,
    optimize: bool = False,
    metadata: dict | None = None,
    prefix: str = "",
) -> Path:
    """Export a PyTorch model to TorchScript format.

    Args:
        model (torch.nn.Module): The PyTorch model to export (may be NMS-wrapped).
        im (torch.Tensor): Example input tensor for tracing.
        file (Path | str): Source model file path used to derive output path.
        optimize (bool): Whether to optimize for mobile deployment.
        metadata (dict | None): Optional metadata to embed in the TorchScript archive.
        prefix (str): Prefix for log messages.

    Returns:
        (Path): Path to the exported ``.torchscript`` file.
    """
    LOGGER.info(f"\n{prefix} starting export with torch {TORCH_VERSION}...")
    file = Path(file)
    f = file.with_suffix(".torchscript")

    ts = torch.jit.trace(model, im, strict=False)
    extra_files = {"config.txt": json.dumps(metadata or {})}  # torch._C.ExtraFilesMap()
    if optimize:  # https://pytorch.org/tutorials/recipes/mobile_interpreter.html
        LOGGER.info(f"{prefix} optimizing for mobile...")
        from torch.utils.mobile_optimizer import optimize_for_mobile

        optimize_for_mobile(ts)._save_for_lite_interpreter(str(f), _extra_files=extra_files)
    else:
        ts.save(str(f), _extra_files=extra_files)
    return f
```
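Metadata written via `_extra_files` survives inside the TorchScript archive and can be read back with `torch.jit.load`. A small round-trip sketch, where the toy `ReLU` module and the `imgsz` value are hypothetical examples:

```python
import json
import tempfile
from pathlib import Path

import torch

# export a toy traced module with embedded metadata (values are illustrative)
f = Path(tempfile.gettempdir()) / "toy.torchscript"
ts = torch.jit.trace(torch.nn.ReLU(), torch.zeros(2), strict=False)
ts.save(str(f), _extra_files={"config.txt": json.dumps({"imgsz": [640, 640]})})

# load it back; pre-populating the dict asks torch.jit.load to fill in the contents
extra = {"config.txt": ""}
loaded = torch.jit.load(str(f), _extra_files=extra)
metadata = json.loads(extra["config.txt"])
print(metadata["imgsz"])  # → [640, 640]
```

This is how consumers of the exported file can recover settings such as class names or input size without re-running the export.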