Reference for ultralytics/nn/backends/mnn.py
This page is sourced from https://github.com/ultralytics/ultralytics/blob/main/ultralytics/nn/backends/mnn.py.
class ultralytics.nn.backends.mnn.MNNBackend
```python
MNNBackend()
```
Bases: BaseBackend
MNN (Mobile Neural Network) inference backend.
Loads and runs inference with MNN models (.mnn files) using the Alibaba MNN framework. Optimized for mobile and edge deployment with configurable thread count and precision.
Methods
| Name | Description |
|---|---|
| `forward` | Run inference using the MNN runtime. |
| `load_model` | Load an Alibaba MNN model from a .mnn file. |
method ultralytics.nn.backends.mnn.MNNBackend.forward
```python
def forward(self, im: torch.Tensor) -> list
```
Run inference using the MNN runtime.
Args
| Name | Type | Description | Default |
|---|---|---|---|
| `im` | `torch.Tensor` | Input image tensor in BCHW format, normalized to [0, 1]. | required |
Returns
| Type | Description |
|---|---|
| `list` | Model predictions as a list of numpy arrays. |
Source code in ultralytics/nn/backends/mnn.py
```python
def forward(self, im: torch.Tensor) -> list:
    """Run inference using the MNN runtime.

    Args:
        im (torch.Tensor): Input image tensor in BCHW format, normalized to [0, 1].

    Returns:
        (list): Model predictions as a list of numpy arrays.
    """
    input_var = self.expr.const(im.data_ptr(), im.shape)
    output_var = self.net.onForward([input_var])
    # NOTE: the copy() is required; without it, results are incorrect on ARM devices
    return [x.read().copy() for x in output_var]
```
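Because `forward` hands the raw `data_ptr()` of `im` to MNN, the input must be a contiguous, batch-first, channels-first float array scaled to [0, 1]. A minimal preprocessing sketch of that contract, using NumPy in place of the full Ultralytics letterboxing pipeline (the `to_bchw` helper is illustrative, not part of the backend):

```python
import numpy as np

def to_bchw(image: np.ndarray) -> np.ndarray:
    """Convert an HWC uint8 image to a normalized BCHW float32 array.

    Mirrors the input contract of MNNBackend.forward: batch-first,
    channels-first, values scaled to [0, 1], contiguous memory.
    """
    x = image.astype(np.float32) / 255.0  # scale uint8 [0, 255] -> float [0, 1]
    x = x.transpose(2, 0, 1)              # HWC -> CHW
    return np.ascontiguousarray(x[None])  # add batch dim -> BCHW

img = np.random.randint(0, 256, (640, 640, 3), dtype=np.uint8)  # dummy image
blob = to_bchw(img)
print(blob.shape, blob.dtype)  # (1, 3, 640, 640) float32
```

In the real pipeline this conversion happens upstream (including letterbox resizing), so `forward` can assume a ready-made BCHW tensor.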
method ultralytics.nn.backends.mnn.MNNBackend.load_model
```python
def load_model(self, weight: str | Path) -> None
```
Load an Alibaba MNN model from a .mnn file.
Args
| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `str \| Path` | Path to the .mnn model file. | required |
Source code in ultralytics/nn/backends/mnn.py
```python
def load_model(self, weight: str | Path) -> None:
    """Load an Alibaba MNN model from a .mnn file.

    Args:
        weight (str | Path): Path to the .mnn model file.
    """
    LOGGER.info(f"Loading {weight} for MNN inference...")
    check_requirements("MNN")
    import MNN

    config = {"precision": "low", "backend": "CPU", "numThread": (os.cpu_count() + 1) // 2}
    rt = MNN.nn.create_runtime_manager((config,))
    self.net = MNN.nn.load_module_from_file(weight, [], [], runtime_manager=rt, rearrange=True)
    self.expr = MNN.expr

    # Load metadata from bizCode
    info = self.net.get_info()
    if "bizCode" in info:
        try:
            self.apply_metadata(json.loads(info["bizCode"]))
        except json.JSONDecodeError:
            pass
```
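Two details of `load_model` are worth noting: the thread count defaults to half the logical cores (rounded up), and model metadata is recovered from the `bizCode` field of the model info, which is expected to hold a JSON string. A standalone sketch of that metadata step, where the `info` dict and `metadata_from_bizcode` helper are stand-ins for `net.get_info()` and the backend's own logic:

```python
import json
import os

# Half the logical cores, rounded up: the same formula load_model uses for numThread.
num_threads = (os.cpu_count() + 1) // 2

def metadata_from_bizcode(info: dict) -> dict:
    """Parse model metadata from the bizCode field, tolerating missing or malformed JSON."""
    try:
        return json.loads(info.get("bizCode", ""))
    except json.JSONDecodeError:
        return {}  # no usable metadata embedded in the model

print(metadata_from_bizcode({"bizCode": '{"stride": 32, "task": "detect"}'}))
print(metadata_from_bizcode({"bizCode": "not-json"}))  # falls back to {}
```

Swallowing `JSONDecodeError` means a model exported without Ultralytics metadata still loads; it simply runs without stride, class-name, or task information applied.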